Note: This is test shard 1 of 8.
[==========] Running 2 tests from 2 test suites.
[----------] Global test environment set-up.
[----------] 1 test from MaintenanceModeRF3ITest
[ RUN ] MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate
2026-05-02T14:06:06Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2026-05-02T14:06:06Z Disabled control of system clock
WARNING: Logging before InitGoogleLogging() is written to STDERR
I20260502 14:06:06.218830 12921 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskW6FTeC/build/release/bin/kudu
/tmp/dist-test-taskW6FTeC/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.12.158.126:43627
--webserver_interface=127.12.158.126
--webserver_port=0
--builtin_ntp_servers=127.12.158.84:36513
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.12.158.126:43627 with env {}
W20260502 14:06:06.292157 12929 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260502 14:06:06.292325 12929 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260502 14:06:06.292343 12929 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260502 14:06:06.293720 12929 flags.cc:432] Enabled experimental flag: --ipki_ca_key_size=768
W20260502 14:06:06.293753 12929 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260502 14:06:06.293766 12929 flags.cc:432] Enabled experimental flag: --tsk_num_rsa_bits=512
W20260502 14:06:06.293778 12929 flags.cc:432] Enabled experimental flag: --rpc_reuseport=true
I20260502 14:06:06.295218 12929 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.12.158.84:36513
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.12.158.126:43627
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.12.158.126:43627
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.12.158.126
--webserver_port=0
--never_fsync=true
--heap_profile_path=/tmp/kudu.12929
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true
Master server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type RELEASE
built by None at 02 May 2026 13:43:15 UTC on e7f111948823
build id 11709
I20260502 14:06:06.295387 12929 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260502 14:06:06.295552 12929 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260502 14:06:06.297879 12937 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:06.297873 12934 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260502 14:06:06.298013 12929 server_base.cc:1061] running on GCE node
W20260502 14:06:06.297909 12935 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260502 14:06:06.298305 12929 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260502 14:06:06.298604 12929 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260502 14:06:06.299800 12929 hybrid_clock.cc:648] HybridClock initialized: now 1777730766299730 us; error 31 us; skew 500 ppm
I20260502 14:06:06.300951 12929 webserver.cc:492] Webserver started at http://127.12.158.126:35229/ using document root <none> and password file <none>
I20260502 14:06:06.301149 12929 fs_manager.cc:362] Metadata directory not provided
I20260502 14:06:06.301211 12929 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260502 14:06:06.301311 12929 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260502 14:06:06.302119 12929 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/master-0/data/instance:
uuid: "91962b7baa9544389d582013c70a54a1"
format_stamp: "Formatted at 2026-05-02 14:06:06 on dist-test-slave-23m0"
I20260502 14:06:06.302431 12929 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/master-0/wal/instance:
uuid: "91962b7baa9544389d582013c70a54a1"
format_stamp: "Formatted at 2026-05-02 14:06:06 on dist-test-slave-23m0"
I20260502 14:06:06.303586 12929 fs_manager.cc:696] Time spent creating directory manager: real 0.001s user 0.002s sys 0.000s
I20260502 14:06:06.304353 12943 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:06.304519 12929 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.000s sys 0.001s
I20260502 14:06:06.304622 12929 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/master-0/data,/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/master-0/wal
uuid: "91962b7baa9544389d582013c70a54a1"
format_stamp: "Formatted at 2026-05-02 14:06:06 on dist-test-slave-23m0"
I20260502 14:06:06.304700 12929 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260502 14:06:06.323010 12929 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260502 14:06:06.323308 12929 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260502 14:06:06.323459 12929 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260502 14:06:06.326941 12929 rpc_server.cc:307] RPC server started. Bound to: 127.12.158.126:43627
I20260502 14:06:06.326968 12995 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.12.158.126:43627 every 8 connection(s)
I20260502 14:06:06.327267 12929 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/master-0/data/info.pb
I20260502 14:06:06.327898 12996 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20260502 14:06:06.329958 12996 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1: Bootstrap starting.
I20260502 14:06:06.330533 12996 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1: Neither blocks nor log segments found. Creating new log.
I20260502 14:06:06.330876 12996 log.cc:826] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1: Log is configured to *not* fsync() on all Append() calls
I20260502 14:06:06.331437 12996 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1: No bootstrap required, opened a new log
I20260502 14:06:06.332288 12921 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskW6FTeC/build/release/bin/kudu as pid 12929
I20260502 14:06:06.332357 12921 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/master-0/wal/instance
I20260502 14:06:06.332647 12996 raft_consensus.cc:359] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "91962b7baa9544389d582013c70a54a1" member_type: VOTER last_known_addr { host: "127.12.158.126" port: 43627 } }
I20260502 14:06:06.332769 12996 raft_consensus.cc:385] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260502 14:06:06.332808 12996 raft_consensus.cc:740] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 91962b7baa9544389d582013c70a54a1, State: Initialized, Role: FOLLOWER
I20260502 14:06:06.332929 12996 consensus_queue.cc:260] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "91962b7baa9544389d582013c70a54a1" member_type: VOTER last_known_addr { host: "127.12.158.126" port: 43627 } }
I20260502 14:06:06.333005 12996 raft_consensus.cc:399] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1 [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20260502 14:06:06.333045 12996 raft_consensus.cc:493] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1 [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20260502 14:06:06.333103 12996 raft_consensus.cc:3060] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1 [term 0 FOLLOWER]: Advancing to term 1
I20260502 14:06:06.333786 12996 raft_consensus.cc:515] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "91962b7baa9544389d582013c70a54a1" member_type: VOTER last_known_addr { host: "127.12.158.126" port: 43627 } }
I20260502 14:06:06.333914 12996 leader_election.cc:304] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 91962b7baa9544389d582013c70a54a1; no voters:
I20260502 14:06:06.334046 12996 leader_election.cc:290] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1 [CANDIDATE]: Term 1 election: Requested vote from peers
I20260502 14:06:06.334081 13001 raft_consensus.cc:2804] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1 [term 1 FOLLOWER]: Leader election won for term 1
I20260502 14:06:06.334192 13001 raft_consensus.cc:697] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1 [term 1 LEADER]: Becoming Leader. State: Replica: 91962b7baa9544389d582013c70a54a1, State: Running, Role: LEADER
I20260502 14:06:06.334256 12996 sys_catalog.cc:565] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1 [sys.catalog]: configured and running, proceeding with master startup.
I20260502 14:06:06.335237 13001 consensus_queue.cc:237] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "91962b7baa9544389d582013c70a54a1" member_type: VOTER last_known_addr { host: "127.12.158.126" port: 43627 } }
I20260502 14:06:06.336413 13002 sys_catalog.cc:455] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1 [sys.catalog]: SysCatalogTable state changed. Reason: New leader 91962b7baa9544389d582013c70a54a1. Latest consensus state: current_term: 1 leader_uuid: "91962b7baa9544389d582013c70a54a1" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "91962b7baa9544389d582013c70a54a1" member_type: VOTER last_known_addr { host: "127.12.158.126" port: 43627 } } }
I20260502 14:06:06.336428 13003 sys_catalog.cc:455] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "91962b7baa9544389d582013c70a54a1" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "91962b7baa9544389d582013c70a54a1" member_type: VOTER last_known_addr { host: "127.12.158.126" port: 43627 } } }
I20260502 14:06:06.336515 13003 sys_catalog.cc:458] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1 [sys.catalog]: This master's current role is: LEADER
I20260502 14:06:06.336511 13002 sys_catalog.cc:458] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1 [sys.catalog]: This master's current role is: LEADER
W20260502 14:06:06.337509 13015 catalog_manager.cc:1568] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1: loading cluster ID for follower catalog manager: Not found: cluster ID entry not found
W20260502 14:06:06.338459 13015 catalog_manager.cc:883] Not found: cluster ID entry not found: failed to prepare follower catalog manager, will retry
I20260502 14:06:06.338549 13016 catalog_manager.cc:1485] Loading table and tablet metadata into memory...
I20260502 14:06:06.338706 13016 catalog_manager.cc:1494] Initializing Kudu cluster ID...
I20260502 14:06:06.339859 13016 catalog_manager.cc:1357] Generated new cluster ID: d5274adbe7e045b8a0612b77419235d1
I20260502 14:06:06.339903 13016 catalog_manager.cc:1505] Initializing Kudu internal certificate authority...
I20260502 14:06:06.344524 13016 catalog_manager.cc:1380] Generated new certificate authority record
I20260502 14:06:06.344956 13016 catalog_manager.cc:1514] Loading token signing keys...
I20260502 14:06:06.353227 13016 catalog_manager.cc:6044] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1: Generated new TSK 0
I20260502 14:06:06.353379 13016 catalog_manager.cc:1524] Initializing in-progress tserver states...
I20260502 14:06:06.357510 12921 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskW6FTeC/build/release/bin/kudu
/tmp/dist-test-taskW6FTeC/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.12.158.65:0
--local_ip_for_outbound_sockets=127.12.158.65
--webserver_interface=127.12.158.65
--webserver_port=0
--tserver_master_addrs=127.12.158.126:43627
--builtin_ntp_servers=127.12.158.84:36513
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--follower_unavailable_considered_failed_sec=2
--enable_log_gc=false with env {}
W20260502 14:06:06.430734 13020 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260502 14:06:06.430893 13020 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260502 14:06:06.430909 13020 flags.cc:432] Enabled unsafe flag: --enable_log_gc=false
W20260502 14:06:06.430923 13020 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260502 14:06:06.432400 13020 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260502 14:06:06.432444 13020 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.12.158.65
I20260502 14:06:06.434783 13020 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.12.158.84:36513
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--follower_unavailable_considered_failed_sec=2
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.12.158.65:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.12.158.65
--webserver_port=0
--enable_log_gc=false
--tserver_master_addrs=127.12.158.126:43627
--never_fsync=true
--heap_profile_path=/tmp/kudu.13020
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.12.158.65
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type RELEASE
built by None at 02 May 2026 13:43:15 UTC on e7f111948823
build id 11709
I20260502 14:06:06.434998 13020 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260502 14:06:06.435218 13020 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260502 14:06:06.436038 13020 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260502 14:06:06.437785 13025 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:06.437847 13028 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:06.437925 13026 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260502 14:06:06.438163 13020 server_base.cc:1061] running on GCE node
I20260502 14:06:06.438315 13020 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260502 14:06:06.438548 13020 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260502 14:06:06.439692 13020 hybrid_clock.cc:648] HybridClock initialized: now 1777730766439681 us; error 41 us; skew 500 ppm
I20260502 14:06:06.440843 13020 webserver.cc:492] Webserver started at http://127.12.158.65:43375/ using document root <none> and password file <none>
I20260502 14:06:06.441056 13020 fs_manager.cc:362] Metadata directory not provided
I20260502 14:06:06.441098 13020 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260502 14:06:06.441202 13020 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260502 14:06:06.441969 13020 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-0/data/instance:
uuid: "d6662a7688c84665b55623d3f97e4ae3"
format_stamp: "Formatted at 2026-05-02 14:06:06 on dist-test-slave-23m0"
I20260502 14:06:06.442252 13020 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-0/wal/instance:
uuid: "d6662a7688c84665b55623d3f97e4ae3"
format_stamp: "Formatted at 2026-05-02 14:06:06 on dist-test-slave-23m0"
I20260502 14:06:06.443368 13020 fs_manager.cc:696] Time spent creating directory manager: real 0.001s user 0.002s sys 0.000s
I20260502 14:06:06.444090 13034 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:06.444228 13020 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:06.444288 13020 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-0/data,/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-0/wal
uuid: "d6662a7688c84665b55623d3f97e4ae3"
format_stamp: "Formatted at 2026-05-02 14:06:06 on dist-test-slave-23m0"
I20260502 14:06:06.444353 13020 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260502 14:06:06.461542 13020 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260502 14:06:06.461796 13020 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260502 14:06:06.461921 13020 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260502 14:06:06.462148 13020 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260502 14:06:06.462507 13020 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260502 14:06:06.462540 13020 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:06.462561 13020 ts_tablet_manager.cc:616] Registered 0 tablets
I20260502 14:06:06.462574 13020 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:06.469461 13020 rpc_server.cc:307] RPC server started. Bound to: 127.12.158.65:44257
I20260502 14:06:06.469539 13147 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.12.158.65:44257 every 8 connection(s)
I20260502 14:06:06.469830 13020 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-0/data/info.pb
I20260502 14:06:06.471376 12921 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskW6FTeC/build/release/bin/kudu as pid 13020
I20260502 14:06:06.471459 12921 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-0/wal/instance
I20260502 14:06:06.472621 12921 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskW6FTeC/build/release/bin/kudu
/tmp/dist-test-taskW6FTeC/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.12.158.66:0
--local_ip_for_outbound_sockets=127.12.158.66
--webserver_interface=127.12.158.66
--webserver_port=0
--tserver_master_addrs=127.12.158.126:43627
--builtin_ntp_servers=127.12.158.84:36513
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--follower_unavailable_considered_failed_sec=2
--enable_log_gc=false with env {}
I20260502 14:06:06.474756 13148 heartbeater.cc:344] Connected to a master server at 127.12.158.126:43627
I20260502 14:06:06.474846 13148 heartbeater.cc:461] Registering TS with master...
I20260502 14:06:06.475041 13148 heartbeater.cc:507] Master 127.12.158.126:43627 requested a full tablet report, sending...
I20260502 14:06:06.475502 12960 ts_manager.cc:194] Registered new tserver with Master: d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257)
I20260502 14:06:06.476195 12960 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.12.158.65:54177
W20260502 14:06:06.556087 13151 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260502 14:06:06.556237 13151 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260502 14:06:06.556254 13151 flags.cc:432] Enabled unsafe flag: --enable_log_gc=false
W20260502 14:06:06.556268 13151 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260502 14:06:06.557919 13151 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260502 14:06:06.557977 13151 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.12.158.66
I20260502 14:06:06.559578 13151 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.12.158.84:36513
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--follower_unavailable_considered_failed_sec=2
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.12.158.66:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.12.158.66
--webserver_port=0
--enable_log_gc=false
--tserver_master_addrs=127.12.158.126:43627
--never_fsync=true
--heap_profile_path=/tmp/kudu.13151
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.12.158.66
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type RELEASE
built by None at 02 May 2026 13:43:15 UTC on e7f111948823
build id 11709
I20260502 14:06:06.559830 13151 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260502 14:06:06.560062 13151 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260502 14:06:06.560693 13151 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260502 14:06:06.562480 13156 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:06.562489 13157 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:06.562489 13159 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260502 14:06:06.562777 13151 server_base.cc:1061] running on GCE node
I20260502 14:06:06.562906 13151 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260502 14:06:06.563082 13151 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260502 14:06:06.564236 13151 hybrid_clock.cc:648] HybridClock initialized: now 1777730766564216 us; error 27 us; skew 500 ppm
I20260502 14:06:06.565340 13151 webserver.cc:492] Webserver started at http://127.12.158.66:41457/ using document root <none> and password file <none>
I20260502 14:06:06.565562 13151 fs_manager.cc:362] Metadata directory not provided
I20260502 14:06:06.565630 13151 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260502 14:06:06.565794 13151 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260502 14:06:06.566761 13151 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-1/data/instance:
uuid: "44217426b1314bc9824ba6f1661fb95b"
format_stamp: "Formatted at 2026-05-02 14:06:06 on dist-test-slave-23m0"
I20260502 14:06:06.567066 13151 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-1/wal/instance:
uuid: "44217426b1314bc9824ba6f1661fb95b"
format_stamp: "Formatted at 2026-05-02 14:06:06 on dist-test-slave-23m0"
I20260502 14:06:06.568292 13151 fs_manager.cc:696] Time spent creating directory manager: real 0.001s user 0.003s sys 0.000s
I20260502 14:06:06.568994 13165 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:06.569175 13151 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.001s sys 0.000s
I20260502 14:06:06.569250 13151 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-1/data,/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-1/wal
uuid: "44217426b1314bc9824ba6f1661fb95b"
format_stamp: "Formatted at 2026-05-02 14:06:06 on dist-test-slave-23m0"
I20260502 14:06:06.569321 13151 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260502 14:06:06.605336 13151 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260502 14:06:06.605639 13151 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260502 14:06:06.605780 13151 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260502 14:06:06.605997 13151 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260502 14:06:06.606333 13151 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260502 14:06:06.606386 13151 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:06.606427 13151 ts_tablet_manager.cc:616] Registered 0 tablets
I20260502 14:06:06.606473 13151 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:06.612099 13151 rpc_server.cc:307] RPC server started. Bound to: 127.12.158.66:33299
I20260502 14:06:06.612149 13278 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.12.158.66:33299 every 8 connection(s)
I20260502 14:06:06.612440 13151 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-1/data/info.pb
I20260502 14:06:06.616940 13279 heartbeater.cc:344] Connected to a master server at 127.12.158.126:43627
I20260502 14:06:06.617008 13279 heartbeater.cc:461] Registering TS with master...
I20260502 14:06:06.617180 13279 heartbeater.cc:507] Master 127.12.158.126:43627 requested a full tablet report, sending...
I20260502 14:06:06.617551 12960 ts_manager.cc:194] Registered new tserver with Master: 44217426b1314bc9824ba6f1661fb95b (127.12.158.66:33299)
I20260502 14:06:06.617914 12960 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.12.158.66:38101
I20260502 14:06:06.618256 12921 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskW6FTeC/build/release/bin/kudu as pid 13151
I20260502 14:06:06.618341 12921 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-1/wal/instance
I20260502 14:06:06.619235 12921 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskW6FTeC/build/release/bin/kudu
/tmp/dist-test-taskW6FTeC/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.12.158.67:0
--local_ip_for_outbound_sockets=127.12.158.67
--webserver_interface=127.12.158.67
--webserver_port=0
--tserver_master_addrs=127.12.158.126:43627
--builtin_ntp_servers=127.12.158.84:36513
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--follower_unavailable_considered_failed_sec=2
--enable_log_gc=false with env {}
W20260502 14:06:06.692567 13282 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260502 14:06:06.692739 13282 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260502 14:06:06.692765 13282 flags.cc:432] Enabled unsafe flag: --enable_log_gc=false
W20260502 14:06:06.692786 13282 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260502 14:06:06.694253 13282 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260502 14:06:06.694361 13282 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.12.158.67
I20260502 14:06:06.695858 13282 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.12.158.84:36513
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--follower_unavailable_considered_failed_sec=2
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.12.158.67:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.12.158.67
--webserver_port=0
--enable_log_gc=false
--tserver_master_addrs=127.12.158.126:43627
--never_fsync=true
--heap_profile_path=/tmp/kudu.13282
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.12.158.67
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type RELEASE
built by None at 02 May 2026 13:43:15 UTC on e7f111948823
build id 11709
I20260502 14:06:06.696089 13282 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260502 14:06:06.696322 13282 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260502 14:06:06.696951 13282 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260502 14:06:06.698959 13288 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:06.699007 13287 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:06.699011 13290 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260502 14:06:06.699774 13282 server_base.cc:1061] running on GCE node
I20260502 14:06:06.699932 13282 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260502 14:06:06.700083 13282 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260502 14:06:06.701218 13282 hybrid_clock.cc:648] HybridClock initialized: now 1777730766701192 us; error 39 us; skew 500 ppm
I20260502 14:06:06.702262 13282 webserver.cc:492] Webserver started at http://127.12.158.67:35421/ using document root <none> and password file <none>
I20260502 14:06:06.702414 13282 fs_manager.cc:362] Metadata directory not provided
I20260502 14:06:06.702450 13282 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260502 14:06:06.702521 13282 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260502 14:06:06.703474 13282 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-2/data/instance:
uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430"
format_stamp: "Formatted at 2026-05-02 14:06:06 on dist-test-slave-23m0"
I20260502 14:06:06.703745 13282 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-2/wal/instance:
uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430"
format_stamp: "Formatted at 2026-05-02 14:06:06 on dist-test-slave-23m0"
I20260502 14:06:06.705015 13282 fs_manager.cc:696] Time spent creating directory manager: real 0.001s user 0.000s sys 0.001s
I20260502 14:06:06.705672 13296 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:06.705864 13282 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.000s sys 0.001s
I20260502 14:06:06.705947 13282 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-2/data,/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-2/wal
uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430"
format_stamp: "Formatted at 2026-05-02 14:06:06 on dist-test-slave-23m0"
I20260502 14:06:06.706022 13282 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260502 14:06:06.715198 13282 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260502 14:06:06.715443 13282 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260502 14:06:06.715572 13282 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260502 14:06:06.715813 13282 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260502 14:06:06.716109 13282 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260502 14:06:06.716161 13282 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:06.716202 13282 ts_tablet_manager.cc:616] Registered 0 tablets
I20260502 14:06:06.716226 13282 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:06.721962 13282 rpc_server.cc:307] RPC server started. Bound to: 127.12.158.67:33939
I20260502 14:06:06.722007 13409 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.12.158.67:33939 every 8 connection(s)
I20260502 14:06:06.722288 13282 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-2/data/info.pb
I20260502 14:06:06.723132 12921 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskW6FTeC/build/release/bin/kudu as pid 13282
I20260502 14:06:06.723243 12921 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-2/wal/instance
I20260502 14:06:06.726464 13410 heartbeater.cc:344] Connected to a master server at 127.12.158.126:43627
I20260502 14:06:06.726548 13410 heartbeater.cc:461] Registering TS with master...
I20260502 14:06:06.726734 13410 heartbeater.cc:507] Master 127.12.158.126:43627 requested a full tablet report, sending...
I20260502 14:06:06.727067 12960 ts_manager.cc:194] Registered new tserver with Master: a1ddd8ec7c554a1eadeb2aa58f7ab430 (127.12.158.67:33939)
I20260502 14:06:06.727435 12960 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.12.158.67:43995
I20260502 14:06:06.735317 12921 external_mini_cluster.cc:949] 3 TS(s) registered with all masters
I20260502 14:06:06.746963 12960 catalog_manager.cc:2257] Servicing CreateTable request from {username='slave'} at 127.0.0.1:60382:
name: "test-workload"
schema {
columns {
name: "key"
type: INT32
is_key: true
is_nullable: false
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
columns {
name: "int_val"
type: INT32
is_key: false
is_nullable: false
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
columns {
name: "string_val"
type: STRING
is_key: false
is_nullable: true
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
}
num_replicas: 3
split_rows_range_bounds {
rows: "<redacted>""\004\001\000UUU\025\004\001\000\252\252\252*\004\001\000\377\377\377?\004\001\000TUUU\004\001\000\251\252\252j"
indirect_data: "<redacted>"""
}
partition_schema {
range_schema {
columns {
name: "key"
}
}
}
W20260502 14:06:06.747256 12960 catalog_manager.cc:7033] The number of live tablet servers is not enough to re-replicate a tablet replica of the newly created table test-workload in case of a server failure: 4 tablet servers would be needed, 3 are available. Consider bringing up more tablet servers.
I20260502 14:06:06.754115 13212 tablet_service.cc:1511] Processing CreateTablet for tablet 1fd859f7e3144efcacca4175c53ef5df (DEFAULT_TABLE table=test-workload [id=0eed4d37dcb4435eb14316140701d979]), partition=RANGE (key) PARTITION 357913941 <= VALUES < 715827882
I20260502 14:06:06.754117 13213 tablet_service.cc:1511] Processing CreateTablet for tablet d4c02bd85161406f918b1aee009c7294 (DEFAULT_TABLE table=test-workload [id=0eed4d37dcb4435eb14316140701d979]), partition=RANGE (key) PARTITION VALUES < 357913941
I20260502 14:06:06.754343 13082 tablet_service.cc:1511] Processing CreateTablet for tablet d4c02bd85161406f918b1aee009c7294 (DEFAULT_TABLE table=test-workload [id=0eed4d37dcb4435eb14316140701d979]), partition=RANGE (key) PARTITION VALUES < 357913941
I20260502 14:06:06.754508 13212 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 1fd859f7e3144efcacca4175c53ef5df. 1 dirs total, 0 dirs full, 0 dirs failed
I20260502 14:06:06.754642 13082 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet d4c02bd85161406f918b1aee009c7294. 1 dirs total, 0 dirs full, 0 dirs failed
I20260502 14:06:06.756148 13339 tablet_service.cc:1511] Processing CreateTablet for tablet be812ec5b6a24e06bd0e507df248605e (DEFAULT_TABLE table=test-workload [id=0eed4d37dcb4435eb14316140701d979]), partition=RANGE (key) PARTITION 1789569705 <= VALUES
I20260502 14:06:06.756408 13339 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet be812ec5b6a24e06bd0e507df248605e. 1 dirs total, 0 dirs full, 0 dirs failed
I20260502 14:06:06.756430 13211 tablet_service.cc:1511] Processing CreateTablet for tablet b175587f4e6a42a6b7e4740bae788705 (DEFAULT_TABLE table=test-workload [id=0eed4d37dcb4435eb14316140701d979]), partition=RANGE (key) PARTITION 715827882 <= VALUES < 1073741823
I20260502 14:06:06.756497 13210 tablet_service.cc:1511] Processing CreateTablet for tablet bf9cb1779b0345fe868c2b551293512a (DEFAULT_TABLE table=test-workload [id=0eed4d37dcb4435eb14316140701d979]), partition=RANGE (key) PARTITION 1073741823 <= VALUES < 1431655764
I20260502 14:06:06.756556 13209 tablet_service.cc:1511] Processing CreateTablet for tablet 7e3a9b987a4e4c74b1f1af634b2f81ed (DEFAULT_TABLE table=test-workload [id=0eed4d37dcb4435eb14316140701d979]), partition=RANGE (key) PARTITION 1431655764 <= VALUES < 1789569705
I20260502 14:06:06.756618 13208 tablet_service.cc:1511] Processing CreateTablet for tablet be812ec5b6a24e06bd0e507df248605e (DEFAULT_TABLE table=test-workload [id=0eed4d37dcb4435eb14316140701d979]), partition=RANGE (key) PARTITION 1789569705 <= VALUES
I20260502 14:06:06.756763 13211 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet b175587f4e6a42a6b7e4740bae788705. 1 dirs total, 0 dirs full, 0 dirs failed
I20260502 14:06:06.757067 13430 tablet_bootstrap.cc:492] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3: Bootstrap starting.
I20260502 14:06:06.757650 13430 tablet_bootstrap.cc:654] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3: Neither blocks nor log segments found. Creating new log.
I20260502 14:06:06.757884 13210 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet bf9cb1779b0345fe868c2b551293512a. 1 dirs total, 0 dirs full, 0 dirs failed
I20260502 14:06:06.757908 13430 log.cc:826] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3: Log is configured to *not* fsync() on all Append() calls
I20260502 14:06:06.758005 13209 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 7e3a9b987a4e4c74b1f1af634b2f81ed. 1 dirs total, 0 dirs full, 0 dirs failed
I20260502 14:06:06.758682 13432 tablet_bootstrap.cc:492] T be812ec5b6a24e06bd0e507df248605e P a1ddd8ec7c554a1eadeb2aa58f7ab430: Bootstrap starting.
I20260502 14:06:06.758744 13340 tablet_service.cc:1511] Processing CreateTablet for tablet 7e3a9b987a4e4c74b1f1af634b2f81ed (DEFAULT_TABLE table=test-workload [id=0eed4d37dcb4435eb14316140701d979]), partition=RANGE (key) PARTITION 1431655764 <= VALUES < 1789569705
I20260502 14:06:06.758821 13340 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 7e3a9b987a4e4c74b1f1af634b2f81ed. 1 dirs total, 0 dirs full, 0 dirs failed
I20260502 14:06:06.759086 13208 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet be812ec5b6a24e06bd0e507df248605e. 1 dirs total, 0 dirs full, 0 dirs failed
I20260502 14:06:06.759224 13432 tablet_bootstrap.cc:654] T be812ec5b6a24e06bd0e507df248605e P a1ddd8ec7c554a1eadeb2aa58f7ab430: Neither blocks nor log segments found. Creating new log.
I20260502 14:06:06.759479 13432 log.cc:826] T be812ec5b6a24e06bd0e507df248605e P a1ddd8ec7c554a1eadeb2aa58f7ab430: Log is configured to *not* fsync() on all Append() calls
I20260502 14:06:06.759934 13081 tablet_service.cc:1511] Processing CreateTablet for tablet 1fd859f7e3144efcacca4175c53ef5df (DEFAULT_TABLE table=test-workload [id=0eed4d37dcb4435eb14316140701d979]), partition=RANGE (key) PARTITION 357913941 <= VALUES < 715827882
I20260502 14:06:06.760030 13081 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 1fd859f7e3144efcacca4175c53ef5df. 1 dirs total, 0 dirs full, 0 dirs failed
I20260502 14:06:06.760102 13213 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet d4c02bd85161406f918b1aee009c7294. 1 dirs total, 0 dirs full, 0 dirs failed
I20260502 14:06:06.760900 13080 tablet_service.cc:1511] Processing CreateTablet for tablet b175587f4e6a42a6b7e4740bae788705 (DEFAULT_TABLE table=test-workload [id=0eed4d37dcb4435eb14316140701d979]), partition=RANGE (key) PARTITION 715827882 <= VALUES < 1073741823
I20260502 14:06:06.760990 13080 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet b175587f4e6a42a6b7e4740bae788705. 1 dirs total, 0 dirs full, 0 dirs failed
I20260502 14:06:06.761163 13429 tablet_bootstrap.cc:492] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b: Bootstrap starting.
I20260502 14:06:06.761826 13079 tablet_service.cc:1511] Processing CreateTablet for tablet bf9cb1779b0345fe868c2b551293512a (DEFAULT_TABLE table=test-workload [id=0eed4d37dcb4435eb14316140701d979]), partition=RANGE (key) PARTITION 1073741823 <= VALUES < 1431655764
I20260502 14:06:06.761885 13079 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet bf9cb1779b0345fe868c2b551293512a. 1 dirs total, 0 dirs full, 0 dirs failed
I20260502 14:06:06.761963 13429 tablet_bootstrap.cc:654] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b: Neither blocks nor log segments found. Creating new log.
I20260502 14:06:06.762154 13429 log.cc:826] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b: Log is configured to *not* fsync() on all Append() calls
I20260502 14:06:06.762673 13429 tablet_bootstrap.cc:492] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b: No bootstrap required, opened a new log
I20260502 14:06:06.762673 13430 tablet_bootstrap.cc:492] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3: No bootstrap required, opened a new log
I20260502 14:06:06.762723 13429 ts_tablet_manager.cc:1403] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b: Time spent bootstrapping tablet: real 0.002s user 0.001s sys 0.000s
I20260502 14:06:06.762723 13430 ts_tablet_manager.cc:1403] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3: Time spent bootstrapping tablet: real 0.006s user 0.001s sys 0.000s
I20260502 14:06:06.762784 13078 tablet_service.cc:1511] Processing CreateTablet for tablet 7e3a9b987a4e4c74b1f1af634b2f81ed (DEFAULT_TABLE table=test-workload [id=0eed4d37dcb4435eb14316140701d979]), partition=RANGE (key) PARTITION 1431655764 <= VALUES < 1789569705
I20260502 14:06:06.762885 13078 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 7e3a9b987a4e4c74b1f1af634b2f81ed. 1 dirs total, 0 dirs full, 0 dirs failed
I20260502 14:06:06.763645 13077 tablet_service.cc:1511] Processing CreateTablet for tablet be812ec5b6a24e06bd0e507df248605e (DEFAULT_TABLE table=test-workload [id=0eed4d37dcb4435eb14316140701d979]), partition=RANGE (key) PARTITION 1789569705 <= VALUES
I20260502 14:06:06.763816 13077 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet be812ec5b6a24e06bd0e507df248605e. 1 dirs total, 0 dirs full, 0 dirs failed
I20260502 14:06:06.763902 13341 tablet_service.cc:1511] Processing CreateTablet for tablet bf9cb1779b0345fe868c2b551293512a (DEFAULT_TABLE table=test-workload [id=0eed4d37dcb4435eb14316140701d979]), partition=RANGE (key) PARTITION 1073741823 <= VALUES < 1431655764
I20260502 14:06:06.764001 13341 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet bf9cb1779b0345fe868c2b551293512a. 1 dirs total, 0 dirs full, 0 dirs failed
I20260502 14:06:06.764331 13429 raft_consensus.cc:359] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.764458 13429 raft_consensus.cc:385] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260502 14:06:06.764487 13429 raft_consensus.cc:740] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 44217426b1314bc9824ba6f1661fb95b, State: Initialized, Role: FOLLOWER
I20260502 14:06:06.764426 13430 raft_consensus.cc:359] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } }
I20260502 14:06:06.764518 13430 raft_consensus.cc:385] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260502 14:06:06.764544 13430 raft_consensus.cc:740] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: d6662a7688c84665b55623d3f97e4ae3, State: Initialized, Role: FOLLOWER
I20260502 14:06:06.764593 13429 consensus_queue.cc:260] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.764629 13430 consensus_queue.cc:260] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } }
I20260502 14:06:06.764837 13430 ts_tablet_manager.cc:1434] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3: Time spent starting tablet: real 0.002s user 0.000s sys 0.002s
I20260502 14:06:06.764837 13429 ts_tablet_manager.cc:1434] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b: Time spent starting tablet: real 0.002s user 0.000s sys 0.002s
I20260502 14:06:06.764933 13430 tablet_bootstrap.cc:492] T 1fd859f7e3144efcacca4175c53ef5df P d6662a7688c84665b55623d3f97e4ae3: Bootstrap starting.
I20260502 14:06:06.764935 13279 heartbeater.cc:499] Master 127.12.158.126:43627 was elected leader, sending a full tablet report...
I20260502 14:06:06.765064 13148 heartbeater.cc:499] Master 127.12.158.126:43627 was elected leader, sending a full tablet report...
I20260502 14:06:06.765210 13342 tablet_service.cc:1511] Processing CreateTablet for tablet b175587f4e6a42a6b7e4740bae788705 (DEFAULT_TABLE table=test-workload [id=0eed4d37dcb4435eb14316140701d979]), partition=RANGE (key) PARTITION 715827882 <= VALUES < 1073741823
I20260502 14:06:06.765290 13342 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet b175587f4e6a42a6b7e4740bae788705. 1 dirs total, 0 dirs full, 0 dirs failed
I20260502 14:06:06.765406 13430 tablet_bootstrap.cc:654] T 1fd859f7e3144efcacca4175c53ef5df P d6662a7688c84665b55623d3f97e4ae3: Neither blocks nor log segments found. Creating new log.
I20260502 14:06:06.765847 13429 tablet_bootstrap.cc:492] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b: Bootstrap starting.
I20260502 14:06:06.766155 13429 tablet_bootstrap.cc:654] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b: Neither blocks nor log segments found. Creating new log.
I20260502 14:06:06.766304 13343 tablet_service.cc:1511] Processing CreateTablet for tablet 1fd859f7e3144efcacca4175c53ef5df (DEFAULT_TABLE table=test-workload [id=0eed4d37dcb4435eb14316140701d979]), partition=RANGE (key) PARTITION 357913941 <= VALUES < 715827882
I20260502 14:06:06.766364 13343 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 1fd859f7e3144efcacca4175c53ef5df. 1 dirs total, 0 dirs full, 0 dirs failed
I20260502 14:06:06.766899 13344 tablet_service.cc:1511] Processing CreateTablet for tablet d4c02bd85161406f918b1aee009c7294 (DEFAULT_TABLE table=test-workload [id=0eed4d37dcb4435eb14316140701d979]), partition=RANGE (key) PARTITION VALUES < 357913941
I20260502 14:06:06.766983 13429 tablet_bootstrap.cc:492] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b: No bootstrap required, opened a new log
I20260502 14:06:06.766991 13344 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet d4c02bd85161406f918b1aee009c7294. 1 dirs total, 0 dirs full, 0 dirs failed
I20260502 14:06:06.767014 13429 ts_tablet_manager.cc:1403] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b: Time spent bootstrapping tablet: real 0.001s user 0.000s sys 0.001s
I20260502 14:06:06.767112 13429 raft_consensus.cc:359] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.767148 13429 raft_consensus.cc:385] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260502 14:06:06.767161 13429 raft_consensus.cc:740] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 44217426b1314bc9824ba6f1661fb95b, State: Initialized, Role: FOLLOWER
I20260502 14:06:06.767217 13429 consensus_queue.cc:260] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.767355 13429 ts_tablet_manager.cc:1434] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b: Time spent starting tablet: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:06.767392 13429 tablet_bootstrap.cc:492] T bf9cb1779b0345fe868c2b551293512a P 44217426b1314bc9824ba6f1661fb95b: Bootstrap starting.
I20260502 14:06:06.767688 13429 tablet_bootstrap.cc:654] T bf9cb1779b0345fe868c2b551293512a P 44217426b1314bc9824ba6f1661fb95b: Neither blocks nor log segments found. Creating new log.
I20260502 14:06:06.768225 13430 tablet_bootstrap.cc:492] T 1fd859f7e3144efcacca4175c53ef5df P d6662a7688c84665b55623d3f97e4ae3: No bootstrap required, opened a new log
I20260502 14:06:06.768358 13430 ts_tablet_manager.cc:1403] T 1fd859f7e3144efcacca4175c53ef5df P d6662a7688c84665b55623d3f97e4ae3: Time spent bootstrapping tablet: real 0.003s user 0.000s sys 0.001s
I20260502 14:06:06.768504 13430 raft_consensus.cc:359] T 1fd859f7e3144efcacca4175c53ef5df P d6662a7688c84665b55623d3f97e4ae3 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.768563 13430 raft_consensus.cc:385] T 1fd859f7e3144efcacca4175c53ef5df P d6662a7688c84665b55623d3f97e4ae3 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260502 14:06:06.768590 13430 raft_consensus.cc:740] T 1fd859f7e3144efcacca4175c53ef5df P d6662a7688c84665b55623d3f97e4ae3 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: d6662a7688c84665b55623d3f97e4ae3, State: Initialized, Role: FOLLOWER
I20260502 14:06:06.768625 13430 consensus_queue.cc:260] T 1fd859f7e3144efcacca4175c53ef5df P d6662a7688c84665b55623d3f97e4ae3 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.768759 13430 ts_tablet_manager.cc:1434] T 1fd859f7e3144efcacca4175c53ef5df P d6662a7688c84665b55623d3f97e4ae3: Time spent starting tablet: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:06.768905 13430 tablet_bootstrap.cc:492] T b175587f4e6a42a6b7e4740bae788705 P d6662a7688c84665b55623d3f97e4ae3: Bootstrap starting.
I20260502 14:06:06.768941 13429 tablet_bootstrap.cc:492] T bf9cb1779b0345fe868c2b551293512a P 44217426b1314bc9824ba6f1661fb95b: No bootstrap required, opened a new log
I20260502 14:06:06.768973 13429 ts_tablet_manager.cc:1403] T bf9cb1779b0345fe868c2b551293512a P 44217426b1314bc9824ba6f1661fb95b: Time spent bootstrapping tablet: real 0.002s user 0.001s sys 0.000s
I20260502 14:06:06.769227 13429 raft_consensus.cc:359] T bf9cb1779b0345fe868c2b551293512a P 44217426b1314bc9824ba6f1661fb95b [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.769299 13429 raft_consensus.cc:385] T bf9cb1779b0345fe868c2b551293512a P 44217426b1314bc9824ba6f1661fb95b [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260502 14:06:06.769320 13430 tablet_bootstrap.cc:654] T b175587f4e6a42a6b7e4740bae788705 P d6662a7688c84665b55623d3f97e4ae3: Neither blocks nor log segments found. Creating new log.
I20260502 14:06:06.769337 13429 raft_consensus.cc:740] T bf9cb1779b0345fe868c2b551293512a P 44217426b1314bc9824ba6f1661fb95b [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 44217426b1314bc9824ba6f1661fb95b, State: Initialized, Role: FOLLOWER
I20260502 14:06:06.769384 13429 consensus_queue.cc:260] T bf9cb1779b0345fe868c2b551293512a P 44217426b1314bc9824ba6f1661fb95b [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.769483 13429 ts_tablet_manager.cc:1434] T bf9cb1779b0345fe868c2b551293512a P 44217426b1314bc9824ba6f1661fb95b: Time spent starting tablet: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:06.769551 13429 tablet_bootstrap.cc:492] T 7e3a9b987a4e4c74b1f1af634b2f81ed P 44217426b1314bc9824ba6f1661fb95b: Bootstrap starting.
I20260502 14:06:06.769665 13436 raft_consensus.cc:493] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260502 14:06:06.769745 13432 tablet_bootstrap.cc:492] T be812ec5b6a24e06bd0e507df248605e P a1ddd8ec7c554a1eadeb2aa58f7ab430: No bootstrap required, opened a new log
I20260502 14:06:06.769729 13436 raft_consensus.cc:515] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.769800 13432 ts_tablet_manager.cc:1403] T be812ec5b6a24e06bd0e507df248605e P a1ddd8ec7c554a1eadeb2aa58f7ab430: Time spent bootstrapping tablet: real 0.011s user 0.001s sys 0.000s
I20260502 14:06:06.769924 13429 tablet_bootstrap.cc:654] T 7e3a9b987a4e4c74b1f1af634b2f81ed P 44217426b1314bc9824ba6f1661fb95b: Neither blocks nor log segments found. Creating new log.
I20260502 14:06:06.770346 13436 leader_election.cc:290] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257), a1ddd8ec7c554a1eadeb2aa58f7ab430 (127.12.158.67:33939)
I20260502 14:06:06.771433 13429 tablet_bootstrap.cc:492] T 7e3a9b987a4e4c74b1f1af634b2f81ed P 44217426b1314bc9824ba6f1661fb95b: No bootstrap required, opened a new log
I20260502 14:06:06.771493 13429 ts_tablet_manager.cc:1403] T 7e3a9b987a4e4c74b1f1af634b2f81ed P 44217426b1314bc9824ba6f1661fb95b: Time spent bootstrapping tablet: real 0.002s user 0.001s sys 0.000s
I20260502 14:06:06.771445 13432 raft_consensus.cc:359] T be812ec5b6a24e06bd0e507df248605e P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.771552 13432 raft_consensus.cc:385] T be812ec5b6a24e06bd0e507df248605e P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260502 14:06:06.771579 13432 raft_consensus.cc:740] T be812ec5b6a24e06bd0e507df248605e P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: a1ddd8ec7c554a1eadeb2aa58f7ab430, State: Initialized, Role: FOLLOWER
I20260502 14:06:06.771646 13429 raft_consensus.cc:359] T 7e3a9b987a4e4c74b1f1af634b2f81ed P 44217426b1314bc9824ba6f1661fb95b [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.771699 13429 raft_consensus.cc:385] T 7e3a9b987a4e4c74b1f1af634b2f81ed P 44217426b1314bc9824ba6f1661fb95b [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260502 14:06:06.771719 13429 raft_consensus.cc:740] T 7e3a9b987a4e4c74b1f1af634b2f81ed P 44217426b1314bc9824ba6f1661fb95b [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 44217426b1314bc9824ba6f1661fb95b, State: Initialized, Role: FOLLOWER
I20260502 14:06:06.771694 13432 consensus_queue.cc:260] T be812ec5b6a24e06bd0e507df248605e P a1ddd8ec7c554a1eadeb2aa58f7ab430 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.771775 13429 consensus_queue.cc:260] T 7e3a9b987a4e4c74b1f1af634b2f81ed P 44217426b1314bc9824ba6f1661fb95b [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.771862 13429 ts_tablet_manager.cc:1434] T 7e3a9b987a4e4c74b1f1af634b2f81ed P 44217426b1314bc9824ba6f1661fb95b: Time spent starting tablet: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:06.771932 13429 tablet_bootstrap.cc:492] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b: Bootstrap starting.
I20260502 14:06:06.772096 13410 heartbeater.cc:499] Master 127.12.158.126:43627 was elected leader, sending a full tablet report...
I20260502 14:06:06.771920 13432 ts_tablet_manager.cc:1434] T be812ec5b6a24e06bd0e507df248605e P a1ddd8ec7c554a1eadeb2aa58f7ab430: Time spent starting tablet: real 0.002s user 0.002s sys 0.000s
I20260502 14:06:06.772413 13429 tablet_bootstrap.cc:654] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b: Neither blocks nor log segments found. Creating new log.
I20260502 14:06:06.772467 13432 tablet_bootstrap.cc:492] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430: Bootstrap starting.
I20260502 14:06:06.772867 13432 tablet_bootstrap.cc:654] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430: Neither blocks nor log segments found. Creating new log.
I20260502 14:06:06.773765 13430 tablet_bootstrap.cc:492] T b175587f4e6a42a6b7e4740bae788705 P d6662a7688c84665b55623d3f97e4ae3: No bootstrap required, opened a new log
I20260502 14:06:06.773829 13430 ts_tablet_manager.cc:1403] T b175587f4e6a42a6b7e4740bae788705 P d6662a7688c84665b55623d3f97e4ae3: Time spent bootstrapping tablet: real 0.005s user 0.000s sys 0.001s
I20260502 14:06:06.773927 13102 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "b175587f4e6a42a6b7e4740bae788705" candidate_uuid: "44217426b1314bc9824ba6f1661fb95b" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "d6662a7688c84665b55623d3f97e4ae3" is_pre_election: true
I20260502 14:06:06.773979 13430 raft_consensus.cc:359] T b175587f4e6a42a6b7e4740bae788705 P d6662a7688c84665b55623d3f97e4ae3 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.774034 13430 raft_consensus.cc:385] T b175587f4e6a42a6b7e4740bae788705 P d6662a7688c84665b55623d3f97e4ae3 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260502 14:06:06.774052 13430 raft_consensus.cc:740] T b175587f4e6a42a6b7e4740bae788705 P d6662a7688c84665b55623d3f97e4ae3 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: d6662a7688c84665b55623d3f97e4ae3, State: Initialized, Role: FOLLOWER
I20260502 14:06:06.774087 13430 consensus_queue.cc:260] T b175587f4e6a42a6b7e4740bae788705 P d6662a7688c84665b55623d3f97e4ae3 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.774175 13430 ts_tablet_manager.cc:1434] T b175587f4e6a42a6b7e4740bae788705 P d6662a7688c84665b55623d3f97e4ae3: Time spent starting tablet: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:06.774190 13102 raft_consensus.cc:2468] T b175587f4e6a42a6b7e4740bae788705 P d6662a7688c84665b55623d3f97e4ae3 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 44217426b1314bc9824ba6f1661fb95b in term 0.
I20260502 14:06:06.774223 13430 tablet_bootstrap.cc:492] T bf9cb1779b0345fe868c2b551293512a P d6662a7688c84665b55623d3f97e4ae3: Bootstrap starting.
I20260502 14:06:06.774427 13169 leader_election.cc:304] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 44217426b1314bc9824ba6f1661fb95b, d6662a7688c84665b55623d3f97e4ae3; no voters:
I20260502 14:06:06.774546 13436 raft_consensus.cc:2804] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b [term 0 FOLLOWER]: Leader pre-election won for term 1
I20260502 14:06:06.774585 13436 raft_consensus.cc:493] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260502 14:06:06.774616 13436 raft_consensus.cc:3060] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b [term 0 FOLLOWER]: Advancing to term 1
I20260502 14:06:06.774659 13430 tablet_bootstrap.cc:654] T bf9cb1779b0345fe868c2b551293512a P d6662a7688c84665b55623d3f97e4ae3: Neither blocks nor log segments found. Creating new log.
I20260502 14:06:06.774657 13432 tablet_bootstrap.cc:492] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430: No bootstrap required, opened a new log
I20260502 14:06:06.774696 13432 ts_tablet_manager.cc:1403] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430: Time spent bootstrapping tablet: real 0.002s user 0.001s sys 0.000s
I20260502 14:06:06.774855 13432 raft_consensus.cc:359] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.774916 13432 raft_consensus.cc:385] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260502 14:06:06.774935 13432 raft_consensus.cc:740] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: a1ddd8ec7c554a1eadeb2aa58f7ab430, State: Initialized, Role: FOLLOWER
I20260502 14:06:06.775069 13364 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "b175587f4e6a42a6b7e4740bae788705" candidate_uuid: "44217426b1314bc9824ba6f1661fb95b" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" is_pre_election: true
W20260502 14:06:06.775367 13166 leader_election.cc:343] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b [CANDIDATE]: Term 1 pre-election: Tablet error from VoteRequest() call to peer a1ddd8ec7c554a1eadeb2aa58f7ab430 (127.12.158.67:33939): Illegal state: must be running to vote when last-logged opid is not known
I20260502 14:06:06.775501 13432 consensus_queue.cc:260] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.775725 13432 ts_tablet_manager.cc:1434] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430: Time spent starting tablet: real 0.001s user 0.000s sys 0.000s
I20260502 14:06:06.775698 13436 raft_consensus.cc:515] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.775869 13432 tablet_bootstrap.cc:492] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430: Bootstrap starting.
I20260502 14:06:06.775890 13439 raft_consensus.cc:493] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260502 14:06:06.775954 13439 raft_consensus.cc:515] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.776013 13102 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "b175587f4e6a42a6b7e4740bae788705" candidate_uuid: "44217426b1314bc9824ba6f1661fb95b" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "d6662a7688c84665b55623d3f97e4ae3"
I20260502 14:06:06.776075 13102 raft_consensus.cc:3060] T b175587f4e6a42a6b7e4740bae788705 P d6662a7688c84665b55623d3f97e4ae3 [term 0 FOLLOWER]: Advancing to term 1
I20260502 14:06:06.776077 13439 leader_election.cc:290] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257), a1ddd8ec7c554a1eadeb2aa58f7ab430 (127.12.158.67:33939)
I20260502 14:06:06.776268 13430 tablet_bootstrap.cc:492] T bf9cb1779b0345fe868c2b551293512a P d6662a7688c84665b55623d3f97e4ae3: No bootstrap required, opened a new log
I20260502 14:06:06.776285 13364 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "1fd859f7e3144efcacca4175c53ef5df" candidate_uuid: "44217426b1314bc9824ba6f1661fb95b" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" is_pre_election: true
I20260502 14:06:06.776327 13430 ts_tablet_manager.cc:1403] T bf9cb1779b0345fe868c2b551293512a P d6662a7688c84665b55623d3f97e4ae3: Time spent bootstrapping tablet: real 0.002s user 0.000s sys 0.001s
W20260502 14:06:06.776446 13166 leader_election.cc:343] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b [CANDIDATE]: Term 1 pre-election: Tablet error from VoteRequest() call to peer a1ddd8ec7c554a1eadeb2aa58f7ab430 (127.12.158.67:33939): Illegal state: must be running to vote when last-logged opid is not known
I20260502 14:06:06.776561 13430 raft_consensus.cc:359] T bf9cb1779b0345fe868c2b551293512a P d6662a7688c84665b55623d3f97e4ae3 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.776599 13430 raft_consensus.cc:385] T bf9cb1779b0345fe868c2b551293512a P d6662a7688c84665b55623d3f97e4ae3 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260502 14:06:06.776612 13430 raft_consensus.cc:740] T bf9cb1779b0345fe868c2b551293512a P d6662a7688c84665b55623d3f97e4ae3 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: d6662a7688c84665b55623d3f97e4ae3, State: Initialized, Role: FOLLOWER
I20260502 14:06:06.776665 13430 consensus_queue.cc:260] T bf9cb1779b0345fe868c2b551293512a P d6662a7688c84665b55623d3f97e4ae3 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.776757 13432 tablet_bootstrap.cc:654] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430: Neither blocks nor log segments found. Creating new log.
I20260502 14:06:06.776819 13430 ts_tablet_manager.cc:1434] T bf9cb1779b0345fe868c2b551293512a P d6662a7688c84665b55623d3f97e4ae3: Time spent starting tablet: real 0.000s user 0.000s sys 0.001s
I20260502 14:06:06.776866 13430 tablet_bootstrap.cc:492] T 7e3a9b987a4e4c74b1f1af634b2f81ed P d6662a7688c84665b55623d3f97e4ae3: Bootstrap starting.
I20260502 14:06:06.776876 13102 raft_consensus.cc:2468] T b175587f4e6a42a6b7e4740bae788705 P d6662a7688c84665b55623d3f97e4ae3 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 44217426b1314bc9824ba6f1661fb95b in term 1.
I20260502 14:06:06.777227 13169 leader_election.cc:304] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 44217426b1314bc9824ba6f1661fb95b, d6662a7688c84665b55623d3f97e4ae3; no voters:
I20260502 14:06:06.777261 13430 tablet_bootstrap.cc:654] T 7e3a9b987a4e4c74b1f1af634b2f81ed P d6662a7688c84665b55623d3f97e4ae3: Neither blocks nor log segments found. Creating new log.
I20260502 14:06:06.777460 13102 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "1fd859f7e3144efcacca4175c53ef5df" candidate_uuid: "44217426b1314bc9824ba6f1661fb95b" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "d6662a7688c84665b55623d3f97e4ae3" is_pre_election: true
I20260502 14:06:06.777524 13102 raft_consensus.cc:2468] T 1fd859f7e3144efcacca4175c53ef5df P d6662a7688c84665b55623d3f97e4ae3 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 44217426b1314bc9824ba6f1661fb95b in term 0.
I20260502 14:06:06.777696 13169 leader_election.cc:304] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 44217426b1314bc9824ba6f1661fb95b, d6662a7688c84665b55623d3f97e4ae3; no voters: a1ddd8ec7c554a1eadeb2aa58f7ab430
I20260502 14:06:06.777745 13432 tablet_bootstrap.cc:492] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430: No bootstrap required, opened a new log
I20260502 14:06:06.777787 13432 ts_tablet_manager.cc:1403] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430: Time spent bootstrapping tablet: real 0.002s user 0.001s sys 0.000s
I20260502 14:06:06.777885 13439 raft_consensus.cc:2804] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b [term 1 FOLLOWER]: Leader election won for term 1
I20260502 14:06:06.777915 13432 raft_consensus.cc:359] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.777988 13432 raft_consensus.cc:385] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260502 14:06:06.778004 13439 raft_consensus.cc:697] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b [term 1 LEADER]: Becoming Leader. State: Replica: 44217426b1314bc9824ba6f1661fb95b, State: Running, Role: LEADER
I20260502 14:06:06.778017 13432 raft_consensus.cc:740] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: a1ddd8ec7c554a1eadeb2aa58f7ab430, State: Initialized, Role: FOLLOWER
I20260502 14:06:06.778062 13429 tablet_bootstrap.cc:492] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b: No bootstrap required, opened a new log
I20260502 14:06:06.778074 13432 consensus_queue.cc:260] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.778095 13439 consensus_queue.cc:237] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.778149 13429 ts_tablet_manager.cc:1403] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b: Time spent bootstrapping tablet: real 0.006s user 0.001s sys 0.000s
I20260502 14:06:06.778163 13432 ts_tablet_manager.cc:1434] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430: Time spent starting tablet: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:06.778220 13432 tablet_bootstrap.cc:492] T b175587f4e6a42a6b7e4740bae788705 P a1ddd8ec7c554a1eadeb2aa58f7ab430: Bootstrap starting.
I20260502 14:06:06.778316 13429 raft_consensus.cc:359] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.778393 13429 raft_consensus.cc:385] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260502 14:06:06.778430 13429 raft_consensus.cc:740] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 44217426b1314bc9824ba6f1661fb95b, State: Initialized, Role: FOLLOWER
I20260502 14:06:06.778432 13439 raft_consensus.cc:2804] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b [term 0 FOLLOWER]: Leader pre-election won for term 1
I20260502 14:06:06.778503 13439 raft_consensus.cc:493] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260502 14:06:06.778544 13439 raft_consensus.cc:3060] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b [term 0 FOLLOWER]: Advancing to term 1
I20260502 14:06:06.778630 13432 tablet_bootstrap.cc:654] T b175587f4e6a42a6b7e4740bae788705 P a1ddd8ec7c554a1eadeb2aa58f7ab430: Neither blocks nor log segments found. Creating new log.
I20260502 14:06:06.779219 13439 raft_consensus.cc:515] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.779249 13432 tablet_bootstrap.cc:492] T b175587f4e6a42a6b7e4740bae788705 P a1ddd8ec7c554a1eadeb2aa58f7ab430: No bootstrap required, opened a new log
I20260502 14:06:06.779289 13432 ts_tablet_manager.cc:1403] T b175587f4e6a42a6b7e4740bae788705 P a1ddd8ec7c554a1eadeb2aa58f7ab430: Time spent bootstrapping tablet: real 0.001s user 0.001s sys 0.000s
I20260502 14:06:06.779315 13439 leader_election.cc:290] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b [CANDIDATE]: Term 1 election: Requested vote from peers d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257), a1ddd8ec7c554a1eadeb2aa58f7ab430 (127.12.158.67:33939)
I20260502 14:06:06.778504 13429 consensus_queue.cc:260] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.779449 13429 ts_tablet_manager.cc:1434] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b: Time spent starting tablet: real 0.001s user 0.000s sys 0.000s
I20260502 14:06:06.779495 13429 tablet_bootstrap.cc:492] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b: Bootstrap starting.
I20260502 14:06:06.779464 13432 raft_consensus.cc:359] T b175587f4e6a42a6b7e4740bae788705 P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.779520 13432 raft_consensus.cc:385] T b175587f4e6a42a6b7e4740bae788705 P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260502 14:06:06.779543 13432 raft_consensus.cc:740] T b175587f4e6a42a6b7e4740bae788705 P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: a1ddd8ec7c554a1eadeb2aa58f7ab430, State: Initialized, Role: FOLLOWER
I20260502 14:06:06.779582 13364 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "1fd859f7e3144efcacca4175c53ef5df" candidate_uuid: "44217426b1314bc9824ba6f1661fb95b" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430"
I20260502 14:06:06.779598 13432 consensus_queue.cc:260] T b175587f4e6a42a6b7e4740bae788705 P a1ddd8ec7c554a1eadeb2aa58f7ab430 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.779503 12959 catalog_manager.cc:5671] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b reported cstate change: term changed from 0 to 1, leader changed from <none> to 44217426b1314bc9824ba6f1661fb95b (127.12.158.66). New cstate: current_term: 1 leader_uuid: "44217426b1314bc9824ba6f1661fb95b" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } health_report { overall_health: UNKNOWN } } }
I20260502 14:06:06.779711 13432 ts_tablet_manager.cc:1434] T b175587f4e6a42a6b7e4740bae788705 P a1ddd8ec7c554a1eadeb2aa58f7ab430: Time spent starting tablet: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:06.779781 13432 tablet_bootstrap.cc:492] T d4c02bd85161406f918b1aee009c7294 P a1ddd8ec7c554a1eadeb2aa58f7ab430: Bootstrap starting.
I20260502 14:06:06.779899 13429 tablet_bootstrap.cc:654] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b: Neither blocks nor log segments found. Creating new log.
W20260502 14:06:06.779983 13166 leader_election.cc:343] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b [CANDIDATE]: Term 1 election: Tablet error from VoteRequest() call to peer a1ddd8ec7c554a1eadeb2aa58f7ab430 (127.12.158.67:33939): Illegal state: must be running to vote when last-logged opid is not known
I20260502 14:06:06.780169 13432 tablet_bootstrap.cc:654] T d4c02bd85161406f918b1aee009c7294 P a1ddd8ec7c554a1eadeb2aa58f7ab430: Neither blocks nor log segments found. Creating new log.
I20260502 14:06:06.780220 13436 leader_election.cc:290] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b [CANDIDATE]: Term 1 election: Requested vote from peers d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257), a1ddd8ec7c554a1eadeb2aa58f7ab430 (127.12.158.67:33939)
I20260502 14:06:06.780342 13102 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "1fd859f7e3144efcacca4175c53ef5df" candidate_uuid: "44217426b1314bc9824ba6f1661fb95b" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "d6662a7688c84665b55623d3f97e4ae3"
I20260502 14:06:06.780406 13102 raft_consensus.cc:3060] T 1fd859f7e3144efcacca4175c53ef5df P d6662a7688c84665b55623d3f97e4ae3 [term 0 FOLLOWER]: Advancing to term 1
I20260502 14:06:06.780550 13364 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "b175587f4e6a42a6b7e4740bae788705" candidate_uuid: "44217426b1314bc9824ba6f1661fb95b" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430"
I20260502 14:06:06.780620 13364 raft_consensus.cc:3060] T b175587f4e6a42a6b7e4740bae788705 P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 0 FOLLOWER]: Advancing to term 1
I20260502 14:06:06.780952 13102 raft_consensus.cc:2468] T 1fd859f7e3144efcacca4175c53ef5df P d6662a7688c84665b55623d3f97e4ae3 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 44217426b1314bc9824ba6f1661fb95b in term 1.
I20260502 14:06:06.781055 13435 raft_consensus.cc:493] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260502 14:06:06.781118 13435 raft_consensus.cc:515] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } }
I20260502 14:06:06.781255 13169 leader_election.cc:304] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 44217426b1314bc9824ba6f1661fb95b, d6662a7688c84665b55623d3f97e4ae3; no voters: a1ddd8ec7c554a1eadeb2aa58f7ab430
I20260502 14:06:06.781347 13436 raft_consensus.cc:2804] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b [term 1 FOLLOWER]: Leader election won for term 1
I20260502 14:06:06.781369 13435 leader_election.cc:290] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers a1ddd8ec7c554a1eadeb2aa58f7ab430 (127.12.158.67:33939), 44217426b1314bc9824ba6f1661fb95b (127.12.158.66:33299)
I20260502 14:06:06.781432 13436 raft_consensus.cc:697] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b [term 1 LEADER]: Becoming Leader. State: Replica: 44217426b1314bc9824ba6f1661fb95b, State: Running, Role: LEADER
I20260502 14:06:06.781450 13364 raft_consensus.cc:2468] T b175587f4e6a42a6b7e4740bae788705 P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 44217426b1314bc9824ba6f1661fb95b in term 1.
I20260502 14:06:06.781478 13436 consensus_queue.cc:237] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.782148 13430 tablet_bootstrap.cc:492] T 7e3a9b987a4e4c74b1f1af634b2f81ed P d6662a7688c84665b55623d3f97e4ae3: No bootstrap required, opened a new log
I20260502 14:06:06.782192 13430 ts_tablet_manager.cc:1403] T 7e3a9b987a4e4c74b1f1af634b2f81ed P d6662a7688c84665b55623d3f97e4ae3: Time spent bootstrapping tablet: real 0.005s user 0.000s sys 0.001s
I20260502 14:06:06.782321 13430 raft_consensus.cc:359] T 7e3a9b987a4e4c74b1f1af634b2f81ed P d6662a7688c84665b55623d3f97e4ae3 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.782383 13430 raft_consensus.cc:385] T 7e3a9b987a4e4c74b1f1af634b2f81ed P d6662a7688c84665b55623d3f97e4ae3 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260502 14:06:06.782402 13430 raft_consensus.cc:740] T 7e3a9b987a4e4c74b1f1af634b2f81ed P d6662a7688c84665b55623d3f97e4ae3 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: d6662a7688c84665b55623d3f97e4ae3, State: Initialized, Role: FOLLOWER
I20260502 14:06:06.782437 13430 consensus_queue.cc:260] T 7e3a9b987a4e4c74b1f1af634b2f81ed P d6662a7688c84665b55623d3f97e4ae3 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.782518 13430 ts_tablet_manager.cc:1434] T 7e3a9b987a4e4c74b1f1af634b2f81ed P d6662a7688c84665b55623d3f97e4ae3: Time spent starting tablet: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:06.782564 13430 tablet_bootstrap.cc:492] T be812ec5b6a24e06bd0e507df248605e P d6662a7688c84665b55623d3f97e4ae3: Bootstrap starting.
I20260502 14:06:06.782574 13429 tablet_bootstrap.cc:492] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b: No bootstrap required, opened a new log
I20260502 14:06:06.782616 13429 ts_tablet_manager.cc:1403] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b: Time spent bootstrapping tablet: real 0.003s user 0.001s sys 0.000s
I20260502 14:06:06.782761 13429 raft_consensus.cc:359] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } }
I20260502 14:06:06.782815 13429 raft_consensus.cc:385] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260502 14:06:06.782836 13429 raft_consensus.cc:740] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 44217426b1314bc9824ba6f1661fb95b, State: Initialized, Role: FOLLOWER
I20260502 14:06:06.782876 13429 consensus_queue.cc:260] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } }
I20260502 14:06:06.782958 13430 tablet_bootstrap.cc:654] T be812ec5b6a24e06bd0e507df248605e P d6662a7688c84665b55623d3f97e4ae3: Neither blocks nor log segments found. Creating new log.
I20260502 14:06:06.782965 13429 ts_tablet_manager.cc:1434] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b: Time spent starting tablet: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:06.782912 12960 catalog_manager.cc:5671] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b reported cstate change: term changed from 0 to 1, leader changed from <none> to 44217426b1314bc9824ba6f1661fb95b (127.12.158.66). New cstate: current_term: 1 leader_uuid: "44217426b1314bc9824ba6f1661fb95b" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } health_report { overall_health: UNKNOWN } } }
I20260502 14:06:06.783430 13432 tablet_bootstrap.cc:492] T d4c02bd85161406f918b1aee009c7294 P a1ddd8ec7c554a1eadeb2aa58f7ab430: No bootstrap required, opened a new log
I20260502 14:06:06.783495 13432 ts_tablet_manager.cc:1403] T d4c02bd85161406f918b1aee009c7294 P a1ddd8ec7c554a1eadeb2aa58f7ab430: Time spent bootstrapping tablet: real 0.004s user 0.001s sys 0.000s
I20260502 14:06:06.783646 13432 raft_consensus.cc:359] T d4c02bd85161406f918b1aee009c7294 P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } }
I20260502 14:06:06.783717 13432 raft_consensus.cc:385] T d4c02bd85161406f918b1aee009c7294 P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260502 14:06:06.783741 13432 raft_consensus.cc:740] T d4c02bd85161406f918b1aee009c7294 P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: a1ddd8ec7c554a1eadeb2aa58f7ab430, State: Initialized, Role: FOLLOWER
I20260502 14:06:06.783812 13432 consensus_queue.cc:260] T d4c02bd85161406f918b1aee009c7294 P a1ddd8ec7c554a1eadeb2aa58f7ab430 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } }
I20260502 14:06:06.783896 13432 ts_tablet_manager.cc:1434] T d4c02bd85161406f918b1aee009c7294 P a1ddd8ec7c554a1eadeb2aa58f7ab430: Time spent starting tablet: real 0.000s user 0.000s sys 0.001s
I20260502 14:06:06.783941 13432 tablet_bootstrap.cc:492] T 1fd859f7e3144efcacca4175c53ef5df P a1ddd8ec7c554a1eadeb2aa58f7ab430: Bootstrap starting.
I20260502 14:06:06.784467 13432 tablet_bootstrap.cc:654] T 1fd859f7e3144efcacca4175c53ef5df P a1ddd8ec7c554a1eadeb2aa58f7ab430: Neither blocks nor log segments found. Creating new log.
I20260502 14:06:06.784576 13233 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d4c02bd85161406f918b1aee009c7294" candidate_uuid: "d6662a7688c84665b55623d3f97e4ae3" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "44217426b1314bc9824ba6f1661fb95b" is_pre_election: true
I20260502 14:06:06.784657 13233 raft_consensus.cc:2468] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate d6662a7688c84665b55623d3f97e4ae3 in term 0.
I20260502 14:06:06.784850 13037 leader_election.cc:304] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 44217426b1314bc9824ba6f1661fb95b, d6662a7688c84665b55623d3f97e4ae3; no voters:
I20260502 14:06:06.784952 13435 raft_consensus.cc:2804] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20260502 14:06:06.784996 13435 raft_consensus.cc:493] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260502 14:06:06.785022 13435 raft_consensus.cc:3060] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3 [term 0 FOLLOWER]: Advancing to term 1
I20260502 14:06:06.785197 13430 tablet_bootstrap.cc:492] T be812ec5b6a24e06bd0e507df248605e P d6662a7688c84665b55623d3f97e4ae3: No bootstrap required, opened a new log
I20260502 14:06:06.785231 13430 ts_tablet_manager.cc:1403] T be812ec5b6a24e06bd0e507df248605e P d6662a7688c84665b55623d3f97e4ae3: Time spent bootstrapping tablet: real 0.003s user 0.000s sys 0.001s
I20260502 14:06:06.785372 13430 raft_consensus.cc:359] T be812ec5b6a24e06bd0e507df248605e P d6662a7688c84665b55623d3f97e4ae3 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.785418 13430 raft_consensus.cc:385] T be812ec5b6a24e06bd0e507df248605e P d6662a7688c84665b55623d3f97e4ae3 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260502 14:06:06.785436 13430 raft_consensus.cc:740] T be812ec5b6a24e06bd0e507df248605e P d6662a7688c84665b55623d3f97e4ae3 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: d6662a7688c84665b55623d3f97e4ae3, State: Initialized, Role: FOLLOWER
I20260502 14:06:06.785473 13430 consensus_queue.cc:260] T be812ec5b6a24e06bd0e507df248605e P d6662a7688c84665b55623d3f97e4ae3 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.785570 13430 ts_tablet_manager.cc:1434] T be812ec5b6a24e06bd0e507df248605e P d6662a7688c84665b55623d3f97e4ae3: Time spent starting tablet: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:06.785617 13432 tablet_bootstrap.cc:492] T 1fd859f7e3144efcacca4175c53ef5df P a1ddd8ec7c554a1eadeb2aa58f7ab430: No bootstrap required, opened a new log
I20260502 14:06:06.785657 13432 ts_tablet_manager.cc:1403] T 1fd859f7e3144efcacca4175c53ef5df P a1ddd8ec7c554a1eadeb2aa58f7ab430: Time spent bootstrapping tablet: real 0.002s user 0.000s sys 0.001s
I20260502 14:06:06.785741 13435 raft_consensus.cc:515] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } }
I20260502 14:06:06.785790 13432 raft_consensus.cc:359] T 1fd859f7e3144efcacca4175c53ef5df P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.785841 13432 raft_consensus.cc:385] T 1fd859f7e3144efcacca4175c53ef5df P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260502 14:06:06.785861 13432 raft_consensus.cc:740] T 1fd859f7e3144efcacca4175c53ef5df P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: a1ddd8ec7c554a1eadeb2aa58f7ab430, State: Initialized, Role: FOLLOWER
I20260502 14:06:06.785851 13435 leader_election.cc:290] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3 [CANDIDATE]: Term 1 election: Requested vote from peers a1ddd8ec7c554a1eadeb2aa58f7ab430 (127.12.158.67:33939), 44217426b1314bc9824ba6f1661fb95b (127.12.158.66:33299)
I20260502 14:06:06.785897 13432 consensus_queue.cc:260] T 1fd859f7e3144efcacca4175c53ef5df P a1ddd8ec7c554a1eadeb2aa58f7ab430 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.785997 13432 ts_tablet_manager.cc:1434] T 1fd859f7e3144efcacca4175c53ef5df P a1ddd8ec7c554a1eadeb2aa58f7ab430: Time spent starting tablet: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:06.786077 13233 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d4c02bd85161406f918b1aee009c7294" candidate_uuid: "d6662a7688c84665b55623d3f97e4ae3" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "44217426b1314bc9824ba6f1661fb95b"
I20260502 14:06:06.786139 13233 raft_consensus.cc:3060] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b [term 0 FOLLOWER]: Advancing to term 1
I20260502 14:06:06.786851 13233 raft_consensus.cc:2468] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate d6662a7688c84665b55623d3f97e4ae3 in term 1.
I20260502 14:06:06.787003 13037 leader_election.cc:304] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 44217426b1314bc9824ba6f1661fb95b, d6662a7688c84665b55623d3f97e4ae3; no voters:
I20260502 14:06:06.787077 13435 raft_consensus.cc:2804] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3 [term 1 FOLLOWER]: Leader election won for term 1
I20260502 14:06:06.787153 13435 raft_consensus.cc:697] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3 [term 1 LEADER]: Becoming Leader. State: Replica: d6662a7688c84665b55623d3f97e4ae3, State: Running, Role: LEADER
I20260502 14:06:06.787228 13435 consensus_queue.cc:237] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } }
I20260502 14:06:06.787245 13364 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d4c02bd85161406f918b1aee009c7294" candidate_uuid: "d6662a7688c84665b55623d3f97e4ae3" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" is_pre_election: true
I20260502 14:06:06.787297 13364 raft_consensus.cc:2468] T d4c02bd85161406f918b1aee009c7294 P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate d6662a7688c84665b55623d3f97e4ae3 in term 0.
I20260502 14:06:06.787258 13363 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d4c02bd85161406f918b1aee009c7294" candidate_uuid: "d6662a7688c84665b55623d3f97e4ae3" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430"
I20260502 14:06:06.787374 13363 raft_consensus.cc:3060] T d4c02bd85161406f918b1aee009c7294 P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 0 FOLLOWER]: Advancing to term 1
I20260502 14:06:06.787889 12960 catalog_manager.cc:5671] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3 reported cstate change: term changed from 0 to 1, leader changed from <none> to d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65). New cstate: current_term: 1 leader_uuid: "d6662a7688c84665b55623d3f97e4ae3" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } health_report { overall_health: HEALTHY } } }
I20260502 14:06:06.788049 13363 raft_consensus.cc:2468] T d4c02bd85161406f918b1aee009c7294 P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate d6662a7688c84665b55623d3f97e4ae3 in term 1.
I20260502 14:06:06.798679 13449 raft_consensus.cc:493] T 7e3a9b987a4e4c74b1f1af634b2f81ed P d6662a7688c84665b55623d3f97e4ae3 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260502 14:06:06.798731 13449 raft_consensus.cc:515] T 7e3a9b987a4e4c74b1f1af634b2f81ed P d6662a7688c84665b55623d3f97e4ae3 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.798840 13449 leader_election.cc:290] T 7e3a9b987a4e4c74b1f1af634b2f81ed P d6662a7688c84665b55623d3f97e4ae3 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 44217426b1314bc9824ba6f1661fb95b (127.12.158.66:33299), a1ddd8ec7c554a1eadeb2aa58f7ab430 (127.12.158.67:33939)
I20260502 14:06:06.799005 13233 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "7e3a9b987a4e4c74b1f1af634b2f81ed" candidate_uuid: "d6662a7688c84665b55623d3f97e4ae3" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "44217426b1314bc9824ba6f1661fb95b" is_pre_election: true
I20260502 14:06:06.799021 13363 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "7e3a9b987a4e4c74b1f1af634b2f81ed" candidate_uuid: "d6662a7688c84665b55623d3f97e4ae3" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" is_pre_election: true
I20260502 14:06:06.799081 13363 raft_consensus.cc:2468] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate d6662a7688c84665b55623d3f97e4ae3 in term 0.
I20260502 14:06:06.799081 13233 raft_consensus.cc:2468] T 7e3a9b987a4e4c74b1f1af634b2f81ed P 44217426b1314bc9824ba6f1661fb95b [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate d6662a7688c84665b55623d3f97e4ae3 in term 0.
I20260502 14:06:06.799216 13035 leader_election.cc:304] T 7e3a9b987a4e4c74b1f1af634b2f81ed P d6662a7688c84665b55623d3f97e4ae3 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: a1ddd8ec7c554a1eadeb2aa58f7ab430, d6662a7688c84665b55623d3f97e4ae3; no voters:
I20260502 14:06:06.799419 13449 raft_consensus.cc:2804] T 7e3a9b987a4e4c74b1f1af634b2f81ed P d6662a7688c84665b55623d3f97e4ae3 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20260502 14:06:06.799479 13449 raft_consensus.cc:493] T 7e3a9b987a4e4c74b1f1af634b2f81ed P d6662a7688c84665b55623d3f97e4ae3 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260502 14:06:06.799520 13449 raft_consensus.cc:3060] T 7e3a9b987a4e4c74b1f1af634b2f81ed P d6662a7688c84665b55623d3f97e4ae3 [term 0 FOLLOWER]: Advancing to term 1
I20260502 14:06:06.800084 13449 raft_consensus.cc:515] T 7e3a9b987a4e4c74b1f1af634b2f81ed P d6662a7688c84665b55623d3f97e4ae3 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.800231 13449 leader_election.cc:290] T 7e3a9b987a4e4c74b1f1af634b2f81ed P d6662a7688c84665b55623d3f97e4ae3 [CANDIDATE]: Term 1 election: Requested vote from peers 44217426b1314bc9824ba6f1661fb95b (127.12.158.66:33299), a1ddd8ec7c554a1eadeb2aa58f7ab430 (127.12.158.67:33939)
I20260502 14:06:06.800371 13233 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "7e3a9b987a4e4c74b1f1af634b2f81ed" candidate_uuid: "d6662a7688c84665b55623d3f97e4ae3" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "44217426b1314bc9824ba6f1661fb95b"
I20260502 14:06:06.800377 13363 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "7e3a9b987a4e4c74b1f1af634b2f81ed" candidate_uuid: "d6662a7688c84665b55623d3f97e4ae3" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430"
I20260502 14:06:06.800437 13233 raft_consensus.cc:3060] T 7e3a9b987a4e4c74b1f1af634b2f81ed P 44217426b1314bc9824ba6f1661fb95b [term 0 FOLLOWER]: Advancing to term 1
I20260502 14:06:06.800437 13363 raft_consensus.cc:3060] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 0 FOLLOWER]: Advancing to term 1
I20260502 14:06:06.800966 13363 raft_consensus.cc:2468] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate d6662a7688c84665b55623d3f97e4ae3 in term 1.
I20260502 14:06:06.800966 13233 raft_consensus.cc:2468] T 7e3a9b987a4e4c74b1f1af634b2f81ed P 44217426b1314bc9824ba6f1661fb95b [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate d6662a7688c84665b55623d3f97e4ae3 in term 1.
I20260502 14:06:06.801141 13035 leader_election.cc:304] T 7e3a9b987a4e4c74b1f1af634b2f81ed P d6662a7688c84665b55623d3f97e4ae3 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: a1ddd8ec7c554a1eadeb2aa58f7ab430, d6662a7688c84665b55623d3f97e4ae3; no voters:
I20260502 14:06:06.801265 13449 raft_consensus.cc:2804] T 7e3a9b987a4e4c74b1f1af634b2f81ed P d6662a7688c84665b55623d3f97e4ae3 [term 1 FOLLOWER]: Leader election won for term 1
I20260502 14:06:06.801340 13449 raft_consensus.cc:697] T 7e3a9b987a4e4c74b1f1af634b2f81ed P d6662a7688c84665b55623d3f97e4ae3 [term 1 LEADER]: Becoming Leader. State: Replica: d6662a7688c84665b55623d3f97e4ae3, State: Running, Role: LEADER
I20260502 14:06:06.801390 13449 consensus_queue.cc:237] T 7e3a9b987a4e4c74b1f1af634b2f81ed P d6662a7688c84665b55623d3f97e4ae3 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.801947 12960 catalog_manager.cc:5671] T 7e3a9b987a4e4c74b1f1af634b2f81ed P d6662a7688c84665b55623d3f97e4ae3 reported cstate change: term changed from 0 to 1, leader changed from <none> to d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65). New cstate: current_term: 1 leader_uuid: "d6662a7688c84665b55623d3f97e4ae3" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } health_report { overall_health: UNKNOWN } } }
I20260502 14:06:06.804028 13445 raft_consensus.cc:493] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260502 14:06:06.804085 13445 raft_consensus.cc:515] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.804349 13445 leader_election.cc:290] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257), 44217426b1314bc9824ba6f1661fb95b (127.12.158.66:33299)
I20260502 14:06:06.807047 13102 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "bf9cb1779b0345fe868c2b551293512a" candidate_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "d6662a7688c84665b55623d3f97e4ae3" is_pre_election: true
I20260502 14:06:06.807126 13102 raft_consensus.cc:2468] T bf9cb1779b0345fe868c2b551293512a P d6662a7688c84665b55623d3f97e4ae3 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate a1ddd8ec7c554a1eadeb2aa58f7ab430 in term 0.
I20260502 14:06:06.807173 13233 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "bf9cb1779b0345fe868c2b551293512a" candidate_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "44217426b1314bc9824ba6f1661fb95b" is_pre_election: true
I20260502 14:06:06.807255 13233 raft_consensus.cc:2468] T bf9cb1779b0345fe868c2b551293512a P 44217426b1314bc9824ba6f1661fb95b [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate a1ddd8ec7c554a1eadeb2aa58f7ab430 in term 0.
I20260502 14:06:06.807294 13300 leader_election.cc:304] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: a1ddd8ec7c554a1eadeb2aa58f7ab430, d6662a7688c84665b55623d3f97e4ae3; no voters:
I20260502 14:06:06.807416 13445 raft_consensus.cc:2804] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20260502 14:06:06.807502 13445 raft_consensus.cc:493] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260502 14:06:06.807524 13445 raft_consensus.cc:3060] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 0 FOLLOWER]: Advancing to term 1
I20260502 14:06:06.808094 13445 raft_consensus.cc:515] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.808223 13445 leader_election.cc:290] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 [CANDIDATE]: Term 1 election: Requested vote from peers d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257), 44217426b1314bc9824ba6f1661fb95b (127.12.158.66:33299)
I20260502 14:06:06.808444 13233 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "bf9cb1779b0345fe868c2b551293512a" candidate_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "44217426b1314bc9824ba6f1661fb95b"
I20260502 14:06:06.808450 13102 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "bf9cb1779b0345fe868c2b551293512a" candidate_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "d6662a7688c84665b55623d3f97e4ae3"
I20260502 14:06:06.808519 13233 raft_consensus.cc:3060] T bf9cb1779b0345fe868c2b551293512a P 44217426b1314bc9824ba6f1661fb95b [term 0 FOLLOWER]: Advancing to term 1
I20260502 14:06:06.808518 13102 raft_consensus.cc:3060] T bf9cb1779b0345fe868c2b551293512a P d6662a7688c84665b55623d3f97e4ae3 [term 0 FOLLOWER]: Advancing to term 1
I20260502 14:06:06.808959 13102 raft_consensus.cc:2468] T bf9cb1779b0345fe868c2b551293512a P d6662a7688c84665b55623d3f97e4ae3 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate a1ddd8ec7c554a1eadeb2aa58f7ab430 in term 1.
I20260502 14:06:06.808965 13233 raft_consensus.cc:2468] T bf9cb1779b0345fe868c2b551293512a P 44217426b1314bc9824ba6f1661fb95b [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate a1ddd8ec7c554a1eadeb2aa58f7ab430 in term 1.
I20260502 14:06:06.809134 13300 leader_election.cc:304] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: a1ddd8ec7c554a1eadeb2aa58f7ab430, d6662a7688c84665b55623d3f97e4ae3; no voters:
I20260502 14:06:06.809216 13445 raft_consensus.cc:2804] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 1 FOLLOWER]: Leader election won for term 1
I20260502 14:06:06.809350 13445 raft_consensus.cc:697] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 1 LEADER]: Becoming Leader. State: Replica: a1ddd8ec7c554a1eadeb2aa58f7ab430, State: Running, Role: LEADER
I20260502 14:06:06.809437 13445 consensus_queue.cc:237] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.810086 12960 catalog_manager.cc:5671] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 reported cstate change: term changed from 0 to 1, leader changed from <none> to a1ddd8ec7c554a1eadeb2aa58f7ab430 (127.12.158.67). New cstate: current_term: 1 leader_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } health_report { overall_health: HEALTHY } } }
I20260502 14:06:06.818892 13445 raft_consensus.cc:493] T 1fd859f7e3144efcacca4175c53ef5df P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260502 14:06:06.818969 13445 raft_consensus.cc:515] T 1fd859f7e3144efcacca4175c53ef5df P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.819136 13445 leader_election.cc:290] T 1fd859f7e3144efcacca4175c53ef5df P a1ddd8ec7c554a1eadeb2aa58f7ab430 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257), 44217426b1314bc9824ba6f1661fb95b (127.12.158.66:33299)
I20260502 14:06:06.819285 13233 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "1fd859f7e3144efcacca4175c53ef5df" candidate_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "44217426b1314bc9824ba6f1661fb95b" is_pre_election: true
I20260502 14:06:06.819314 13102 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "1fd859f7e3144efcacca4175c53ef5df" candidate_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "d6662a7688c84665b55623d3f97e4ae3" is_pre_election: true
I20260502 14:06:06.819401 13102 raft_consensus.cc:2393] T 1fd859f7e3144efcacca4175c53ef5df P d6662a7688c84665b55623d3f97e4ae3 [term 1 FOLLOWER]: Leader pre-election vote request: Denying vote to candidate a1ddd8ec7c554a1eadeb2aa58f7ab430 in current term 1: Already voted for candidate 44217426b1314bc9824ba6f1661fb95b in this term.
I20260502 14:06:06.819583 13300 leader_election.cc:304] T 1fd859f7e3144efcacca4175c53ef5df P a1ddd8ec7c554a1eadeb2aa58f7ab430 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: a1ddd8ec7c554a1eadeb2aa58f7ab430; no voters: 44217426b1314bc9824ba6f1661fb95b, d6662a7688c84665b55623d3f97e4ae3
I20260502 14:06:06.819818 13445 raft_consensus.cc:3060] T 1fd859f7e3144efcacca4175c53ef5df P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 0 FOLLOWER]: Advancing to term 1
I20260502 14:06:06.820349 13445 raft_consensus.cc:2749] T 1fd859f7e3144efcacca4175c53ef5df P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 1 FOLLOWER]: Leader pre-election lost for term 1. Reason: could not achieve majority
I20260502 14:06:06.821678 13453 raft_consensus.cc:493] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260502 14:06:06.821734 13453 raft_consensus.cc:515] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.821859 13453 leader_election.cc:290] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257), a1ddd8ec7c554a1eadeb2aa58f7ab430 (127.12.158.67:33939)
I20260502 14:06:06.822000 13102 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "be812ec5b6a24e06bd0e507df248605e" candidate_uuid: "44217426b1314bc9824ba6f1661fb95b" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "d6662a7688c84665b55623d3f97e4ae3" is_pre_election: true
I20260502 14:06:06.822073 13102 raft_consensus.cc:2468] T be812ec5b6a24e06bd0e507df248605e P d6662a7688c84665b55623d3f97e4ae3 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 44217426b1314bc9824ba6f1661fb95b in term 0.
I20260502 14:06:06.822207 13169 leader_election.cc:304] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 44217426b1314bc9824ba6f1661fb95b, d6662a7688c84665b55623d3f97e4ae3; no voters:
I20260502 14:06:06.822322 13453 raft_consensus.cc:2804] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b [term 0 FOLLOWER]: Leader pre-election won for term 1
I20260502 14:06:06.822367 13453 raft_consensus.cc:493] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260502 14:06:06.822398 13453 raft_consensus.cc:3060] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b [term 0 FOLLOWER]: Advancing to term 1
I20260502 14:06:06.822638 13363 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "be812ec5b6a24e06bd0e507df248605e" candidate_uuid: "44217426b1314bc9824ba6f1661fb95b" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" is_pre_election: true
I20260502 14:06:06.822710 13363 raft_consensus.cc:2468] T be812ec5b6a24e06bd0e507df248605e P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 44217426b1314bc9824ba6f1661fb95b in term 0.
I20260502 14:06:06.823022 13453 raft_consensus.cc:515] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.823190 13453 leader_election.cc:290] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b [CANDIDATE]: Term 1 election: Requested vote from peers d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257), a1ddd8ec7c554a1eadeb2aa58f7ab430 (127.12.158.67:33939)
I20260502 14:06:06.823338 13363 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "be812ec5b6a24e06bd0e507df248605e" candidate_uuid: "44217426b1314bc9824ba6f1661fb95b" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430"
I20260502 14:06:06.823351 13102 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "be812ec5b6a24e06bd0e507df248605e" candidate_uuid: "44217426b1314bc9824ba6f1661fb95b" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "d6662a7688c84665b55623d3f97e4ae3"
I20260502 14:06:06.823414 13102 raft_consensus.cc:3060] T be812ec5b6a24e06bd0e507df248605e P d6662a7688c84665b55623d3f97e4ae3 [term 0 FOLLOWER]: Advancing to term 1
I20260502 14:06:06.823416 13363 raft_consensus.cc:3060] T be812ec5b6a24e06bd0e507df248605e P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 0 FOLLOWER]: Advancing to term 1
I20260502 14:06:06.823969 13102 raft_consensus.cc:2468] T be812ec5b6a24e06bd0e507df248605e P d6662a7688c84665b55623d3f97e4ae3 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 44217426b1314bc9824ba6f1661fb95b in term 1.
I20260502 14:06:06.824090 13363 raft_consensus.cc:2468] T be812ec5b6a24e06bd0e507df248605e P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 44217426b1314bc9824ba6f1661fb95b in term 1.
I20260502 14:06:06.824141 13169 leader_election.cc:304] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 44217426b1314bc9824ba6f1661fb95b, d6662a7688c84665b55623d3f97e4ae3; no voters:
I20260502 14:06:06.824302 13453 raft_consensus.cc:2804] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b [term 1 FOLLOWER]: Leader election won for term 1
I20260502 14:06:06.824385 13453 raft_consensus.cc:697] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b [term 1 LEADER]: Becoming Leader. State: Replica: 44217426b1314bc9824ba6f1661fb95b, State: Running, Role: LEADER
I20260502 14:06:06.824462 13453 consensus_queue.cc:237] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:06.825109 12960 catalog_manager.cc:5671] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b reported cstate change: term changed from 0 to 1, leader changed from <none> to 44217426b1314bc9824ba6f1661fb95b (127.12.158.66). New cstate: current_term: 1 leader_uuid: "44217426b1314bc9824ba6f1661fb95b" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } health_report { overall_health: UNKNOWN } } }
I20260502 14:06:06.828743 12921 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskW6FTeC/build/release/bin/kudu
/tmp/dist-test-taskW6FTeC/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.12.158.68:0
--local_ip_for_outbound_sockets=127.12.158.68
--webserver_interface=127.12.158.68
--webserver_port=0
--tserver_master_addrs=127.12.158.126:43627
--builtin_ntp_servers=127.12.158.84:36513
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--follower_unavailable_considered_failed_sec=2
--enable_log_gc=false with env {}
I20260502 14:06:06.841316 13363 raft_consensus.cc:1275] T b175587f4e6a42a6b7e4740bae788705 P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 1 FOLLOWER]: Refusing update from remote peer 44217426b1314bc9824ba6f1661fb95b: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260502 14:06:06.841579 13453 consensus_queue.cc:1048] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b [LEADER]: Connected to new peer: Peer: permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260502 14:06:06.841940 13102 raft_consensus.cc:1275] T b175587f4e6a42a6b7e4740bae788705 P d6662a7688c84665b55623d3f97e4ae3 [term 1 FOLLOWER]: Refusing update from remote peer 44217426b1314bc9824ba6f1661fb95b: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260502 14:06:06.842655 13452 consensus_queue.cc:1048] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b [LEADER]: Connected to new peer: Peer: permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260502 14:06:06.843284 13364 raft_consensus.cc:1275] T be812ec5b6a24e06bd0e507df248605e P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 1 FOLLOWER]: Refusing update from remote peer 44217426b1314bc9824ba6f1661fb95b: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260502 14:06:06.843479 13101 raft_consensus.cc:1275] T 1fd859f7e3144efcacca4175c53ef5df P d6662a7688c84665b55623d3f97e4ae3 [term 1 FOLLOWER]: Refusing update from remote peer 44217426b1314bc9824ba6f1661fb95b: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260502 14:06:06.843750 13453 consensus_queue.cc:1048] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b [LEADER]: Connected to new peer: Peer: permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260502 14:06:06.843833 13364 raft_consensus.cc:1275] T 1fd859f7e3144efcacca4175c53ef5df P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 1 FOLLOWER]: Refusing update from remote peer 44217426b1314bc9824ba6f1661fb95b: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260502 14:06:06.844034 13452 consensus_queue.cc:1048] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b [LEADER]: Connected to new peer: Peer: permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260502 14:06:06.844138 13452 consensus_queue.cc:1048] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b [LEADER]: Connected to new peer: Peer: permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260502 14:06:06.844986 13233 raft_consensus.cc:1275] T bf9cb1779b0345fe868c2b551293512a P 44217426b1314bc9824ba6f1661fb95b [term 1 FOLLOWER]: Refusing update from remote peer a1ddd8ec7c554a1eadeb2aa58f7ab430: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260502 14:06:06.845115 13363 raft_consensus.cc:1275] T d4c02bd85161406f918b1aee009c7294 P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 1 FOLLOWER]: Refusing update from remote peer d6662a7688c84665b55623d3f97e4ae3: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260502 14:06:06.845148 13232 raft_consensus.cc:1275] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b [term 1 FOLLOWER]: Refusing update from remote peer d6662a7688c84665b55623d3f97e4ae3: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260502 14:06:06.845463 13449 consensus_queue.cc:1048] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3 [LEADER]: Connected to new peer: Peer: permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260502 14:06:06.845510 13100 raft_consensus.cc:1275] T bf9cb1779b0345fe868c2b551293512a P d6662a7688c84665b55623d3f97e4ae3 [term 1 FOLLOWER]: Refusing update from remote peer a1ddd8ec7c554a1eadeb2aa58f7ab430: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260502 14:06:06.845743 13232 raft_consensus.cc:1275] T 7e3a9b987a4e4c74b1f1af634b2f81ed P 44217426b1314bc9824ba6f1661fb95b [term 1 FOLLOWER]: Refusing update from remote peer d6662a7688c84665b55623d3f97e4ae3: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260502 14:06:06.845715 13445 consensus_queue.cc:1048] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 [LEADER]: Connected to new peer: Peer: permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260502 14:06:06.845985 13445 consensus_queue.cc:1048] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 [LEADER]: Connected to new peer: Peer: permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260502 14:06:06.846231 13435 consensus_queue.cc:1048] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3 [LEADER]: Connected to new peer: Peer: permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260502 14:06:06.846366 13435 consensus_queue.cc:1048] T 7e3a9b987a4e4c74b1f1af634b2f81ed P d6662a7688c84665b55623d3f97e4ae3 [LEADER]: Connected to new peer: Peer: permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260502 14:06:06.846498 13364 raft_consensus.cc:1275] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 1 FOLLOWER]: Refusing update from remote peer d6662a7688c84665b55623d3f97e4ae3: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260502 14:06:06.847007 13449 consensus_queue.cc:1048] T 7e3a9b987a4e4c74b1f1af634b2f81ed P d6662a7688c84665b55623d3f97e4ae3 [LEADER]: Connected to new peer: Peer: permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260502 14:06:06.848089 13476 mvcc.cc:204] Tried to move back new op lower bound from 7281585220979781632 to 7281585220723654656. Current Snapshot: MvccSnapshot[applied={T|T < 7281585220979781632}]
I20260502 14:06:06.848266 13479 mvcc.cc:204] Tried to move back new op lower bound from 7281585220979781632 to 7281585220723654656. Current Snapshot: MvccSnapshot[applied={T|T < 7281585220979781632}]
I20260502 14:06:06.848505 13099 raft_consensus.cc:1275] T be812ec5b6a24e06bd0e507df248605e P d6662a7688c84665b55623d3f97e4ae3 [term 1 FOLLOWER]: Refusing update from remote peer 44217426b1314bc9824ba6f1661fb95b: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260502 14:06:06.850068 13484 mvcc.cc:204] Tried to move back new op lower bound from 7281585220995248128 to 7281585220852015104. Current Snapshot: MvccSnapshot[applied={T|T < 7281585220995248128}]
I20260502 14:06:06.851230 13453 consensus_queue.cc:1048] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b [LEADER]: Connected to new peer: Peer: permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
W20260502 14:06:06.863493 13280 tablet_replica_mm_ops.cc:318] Log GC is disabled (check --enable_log_gc)
W20260502 14:06:06.968916 13472 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260502 14:06:06.969177 13472 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260502 14:06:06.969293 13472 flags.cc:432] Enabled unsafe flag: --enable_log_gc=false
W20260502 14:06:06.969388 13472 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260502 14:06:06.970541 13149 tablet_replica_mm_ops.cc:318] Log GC is disabled (check --enable_log_gc)
W20260502 14:06:06.972218 13472 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260502 14:06:06.972314 13472 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.12.158.68
W20260502 14:06:06.972857 13411 tablet_replica_mm_ops.cc:318] Log GC is disabled (check --enable_log_gc)
I20260502 14:06:06.974653 13472 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.12.158.84:36513
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--follower_unavailable_considered_failed_sec=2
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.12.158.68:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.12.158.68
--webserver_port=0
--enable_log_gc=false
--tserver_master_addrs=127.12.158.126:43627
--never_fsync=true
--heap_profile_path=/tmp/kudu.13472
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.12.158.68
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type RELEASE
built by None at 02 May 2026 13:43:15 UTC on e7f111948823
build id 11709
I20260502 14:06:06.975082 13472 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260502 14:06:06.975394 13472 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260502 14:06:06.976307 13472 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260502 14:06:06.978413 13554 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:06.978956 13552 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:06.979166 13551 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260502 14:06:06.980415 13472 server_base.cc:1061] running on GCE node
I20260502 14:06:06.980695 13472 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260502 14:06:06.980911 13472 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260502 14:06:06.984169 13472 hybrid_clock.cc:648] HybridClock initialized: now 1777730766984138 us; error 37 us; skew 500 ppm
I20260502 14:06:06.985556 13472 webserver.cc:492] Webserver started at http://127.12.158.68:39209/ using document root <none> and password file <none>
I20260502 14:06:06.985774 13472 fs_manager.cc:362] Metadata directory not provided
I20260502 14:06:06.985834 13472 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260502 14:06:06.985937 13472 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260502 14:06:06.987004 13472 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-3/data/instance:
uuid: "3d450e4b5d234cc08a659a2375cf9753"
format_stamp: "Formatted at 2026-05-02 14:06:06 on dist-test-slave-23m0"
I20260502 14:06:06.987363 13472 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-3/wal/instance:
uuid: "3d450e4b5d234cc08a659a2375cf9753"
format_stamp: "Formatted at 2026-05-02 14:06:06 on dist-test-slave-23m0"
I20260502 14:06:06.990516 13472 fs_manager.cc:696] Time spent creating directory manager: real 0.003s user 0.000s sys 0.002s
I20260502 14:06:06.993244 13560 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:06.993405 13472 fs_manager.cc:730] Time spent opening block manager: real 0.002s user 0.000s sys 0.000s
I20260502 14:06:06.993592 13472 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-3/data,/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-3/wal
uuid: "3d450e4b5d234cc08a659a2375cf9753"
format_stamp: "Formatted at 2026-05-02 14:06:06 on dist-test-slave-23m0"
I20260502 14:06:06.993661 13472 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260502 14:06:07.016219 13472 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260502 14:06:07.016515 13472 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260502 14:06:07.016641 13472 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260502 14:06:07.016852 13472 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260502 14:06:07.017200 13472 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260502 14:06:07.017244 13472 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:07.017277 13472 ts_tablet_manager.cc:616] Registered 0 tablets
I20260502 14:06:07.017300 13472 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:07.024644 13472 rpc_server.cc:307] RPC server started. Bound to: 127.12.158.68:35609
I20260502 14:06:07.025009 13472 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-3/data/info.pb
I20260502 14:06:07.027109 13673 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.12.158.68:35609 every 8 connection(s)
I20260502 14:06:07.032788 13674 heartbeater.cc:344] Connected to a master server at 127.12.158.126:43627
I20260502 14:06:07.032869 13674 heartbeater.cc:461] Registering TS with master...
I20260502 14:06:07.033087 13674 heartbeater.cc:507] Master 127.12.158.126:43627 requested a full tablet report, sending...
I20260502 14:06:07.033434 12956 ts_manager.cc:194] Registered new tserver with Master: 3d450e4b5d234cc08a659a2375cf9753 (127.12.158.68:35609)
I20260502 14:06:07.033982 12956 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.12.158.68:35005
I20260502 14:06:07.034219 12921 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskW6FTeC/build/release/bin/kudu as pid 13472
I20260502 14:06:07.034283 12921 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-3/wal/instance
I20260502 14:06:07.135537 12956 ts_manager.cc:295] Set tserver state for d6662a7688c84665b55623d3f97e4ae3 to MAINTENANCE_MODE
I20260502 14:06:07.136080 12921 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskW6FTeC/build/release/bin/kudu with pid 13020
W20260502 14:06:07.145215 13300 connection.cc:570] client connection to 127.12.158.65:44257 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20260502 14:06:07.145236 13421 connection.cc:570] client connection to 127.12.158.65:44257 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20260502 14:06:07.145315 13300 proxy.cc:239] Call had error, refreshing address and retrying: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20260502 14:06:07.145337 13421 meta_cache.cc:302] tablet d4c02bd85161406f918b1aee009c7294: replica d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257) has failed: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20260502 14:06:07.145563 13169 connection.cc:570] client connection to 127.12.158.65:44257 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20260502 14:06:07.145610 13299 connection.cc:570] server connection from 127.12.158.65:40181 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20260502 14:06:07.145665 13169 proxy.cc:239] Call had error, refreshing address and retrying: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20260502 14:06:07.145577 13167 connection.cc:570] server connection from 127.12.158.65:48323 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20260502 14:06:07.145797 13300 consensus_peers.cc:597] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20260502 14:06:07.146101 13169 consensus_peers.cc:597] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20260502 14:06:07.146142 13169 consensus_peers.cc:597] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20260502 14:06:07.146157 13169 consensus_peers.cc:597] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20260502 14:06:07.146804 13322 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50454: Illegal state: replica a1ddd8ec7c554a1eadeb2aa58f7ab430 is not leader of this config: current role FOLLOWER
W20260502 14:06:07.147863 13322 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50454: Illegal state: replica a1ddd8ec7c554a1eadeb2aa58f7ab430 is not leader of this config: current role FOLLOWER
W20260502 14:06:07.147962 13186 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45396: Illegal state: replica 44217426b1314bc9824ba6f1661fb95b is not leader of this config: current role FOLLOWER
W20260502 14:06:07.151103 13323 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50454: Illegal state: replica a1ddd8ec7c554a1eadeb2aa58f7ab430 is not leader of this config: current role FOLLOWER
W20260502 14:06:07.151103 13190 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45396: Illegal state: replica 44217426b1314bc9824ba6f1661fb95b is not leader of this config: current role FOLLOWER
W20260502 14:06:07.151103 13322 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50454: Illegal state: replica a1ddd8ec7c554a1eadeb2aa58f7ab430 is not leader of this config: current role FOLLOWER
W20260502 14:06:07.151103 13186 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45396: Illegal state: replica 44217426b1314bc9824ba6f1661fb95b is not leader of this config: current role FOLLOWER
W20260502 14:06:07.152609 13322 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50454: Illegal state: replica a1ddd8ec7c554a1eadeb2aa58f7ab430 is not leader of this config: current role FOLLOWER
W20260502 14:06:07.152719 13186 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45396: Illegal state: replica 44217426b1314bc9824ba6f1661fb95b is not leader of this config: current role FOLLOWER
W20260502 14:06:07.154264 13322 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50454: Illegal state: replica a1ddd8ec7c554a1eadeb2aa58f7ab430 is not leader of this config: current role FOLLOWER
W20260502 14:06:07.154469 13186 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45396: Illegal state: replica 44217426b1314bc9824ba6f1661fb95b is not leader of this config: current role FOLLOWER
W20260502 14:06:07.154862 13322 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50454: Illegal state: replica a1ddd8ec7c554a1eadeb2aa58f7ab430 is not leader of this config: current role FOLLOWER
W20260502 14:06:07.156306 13186 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45396: Illegal state: replica 44217426b1314bc9824ba6f1661fb95b is not leader of this config: current role FOLLOWER
W20260502 14:06:07.165122 13186 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45396: Illegal state: replica 44217426b1314bc9824ba6f1661fb95b is not leader of this config: current role FOLLOWER
W20260502 14:06:07.166287 13322 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50454: Illegal state: replica a1ddd8ec7c554a1eadeb2aa58f7ab430 is not leader of this config: current role FOLLOWER
W20260502 14:06:07.166314 13323 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50454: Illegal state: replica a1ddd8ec7c554a1eadeb2aa58f7ab430 is not leader of this config: current role FOLLOWER
W20260502 14:06:07.166706 13322 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50454: Illegal state: replica a1ddd8ec7c554a1eadeb2aa58f7ab430 is not leader of this config: current role FOLLOWER
W20260502 14:06:07.170841 13322 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50454: Illegal state: replica a1ddd8ec7c554a1eadeb2aa58f7ab430 is not leader of this config: current role FOLLOWER
W20260502 14:06:07.172293 13186 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45396: Illegal state: replica 44217426b1314bc9824ba6f1661fb95b is not leader of this config: current role FOLLOWER
W20260502 14:06:07.173656 13322 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50454: Illegal state: replica a1ddd8ec7c554a1eadeb2aa58f7ab430 is not leader of this config: current role FOLLOWER
W20260502 14:06:07.173789 13186 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45396: Illegal state: replica 44217426b1314bc9824ba6f1661fb95b is not leader of this config: current role FOLLOWER
W20260502 14:06:07.175804 13322 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50454: Illegal state: replica a1ddd8ec7c554a1eadeb2aa58f7ab430 is not leader of this config: current role FOLLOWER
W20260502 14:06:07.175906 13186 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45396: Illegal state: replica 44217426b1314bc9824ba6f1661fb95b is not leader of this config: current role FOLLOWER
W20260502 14:06:07.180469 13186 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45396: Illegal state: replica 44217426b1314bc9824ba6f1661fb95b is not leader of this config: current role FOLLOWER
W20260502 14:06:07.189580 13322 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50454: Illegal state: replica a1ddd8ec7c554a1eadeb2aa58f7ab430 is not leader of this config: current role FOLLOWER
W20260502 14:06:07.190662 13322 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50454: Illegal state: replica a1ddd8ec7c554a1eadeb2aa58f7ab430 is not leader of this config: current role FOLLOWER
W20260502 14:06:07.191814 13186 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45396: Illegal state: replica 44217426b1314bc9824ba6f1661fb95b is not leader of this config: current role FOLLOWER
W20260502 14:06:07.193871 13322 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50454: Illegal state: replica a1ddd8ec7c554a1eadeb2aa58f7ab430 is not leader of this config: current role FOLLOWER
W20260502 14:06:07.196950 13186 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45396: Illegal state: replica 44217426b1314bc9824ba6f1661fb95b is not leader of this config: current role FOLLOWER
W20260502 14:06:07.201407 13186 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45396: Illegal state: replica 44217426b1314bc9824ba6f1661fb95b is not leader of this config: current role FOLLOWER
W20260502 14:06:07.202420 13186 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45396: Illegal state: replica 44217426b1314bc9824ba6f1661fb95b is not leader of this config: current role FOLLOWER
W20260502 14:06:07.203528 13186 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45396: Illegal state: replica 44217426b1314bc9824ba6f1661fb95b is not leader of this config: current role FOLLOWER
W20260502 14:06:07.210152 13322 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50454: Illegal state: replica a1ddd8ec7c554a1eadeb2aa58f7ab430 is not leader of this config: current role FOLLOWER
W20260502 14:06:07.225147 13322 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50454: Illegal state: replica a1ddd8ec7c554a1eadeb2aa58f7ab430 is not leader of this config: current role FOLLOWER
W20260502 14:06:07.226254 13322 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50454: Illegal state: replica a1ddd8ec7c554a1eadeb2aa58f7ab430 is not leader of this config: current role FOLLOWER
W20260502 14:06:07.226264 13186 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45396: Illegal state: replica 44217426b1314bc9824ba6f1661fb95b is not leader of this config: current role FOLLOWER
W20260502 14:06:07.230459 13322 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50454: Illegal state: replica a1ddd8ec7c554a1eadeb2aa58f7ab430 is not leader of this config: current role FOLLOWER
W20260502 14:06:07.236585 13186 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45396: Illegal state: replica 44217426b1314bc9824ba6f1661fb95b is not leader of this config: current role FOLLOWER
W20260502 14:06:07.238960 13186 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45396: Illegal state: replica 44217426b1314bc9824ba6f1661fb95b is not leader of this config: current role FOLLOWER
W20260502 14:06:07.240440 13322 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50454: Illegal state: replica a1ddd8ec7c554a1eadeb2aa58f7ab430 is not leader of this config: current role FOLLOWER
W20260502 14:06:07.240779 13186 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45396: Illegal state: replica 44217426b1314bc9824ba6f1661fb95b is not leader of this config: current role FOLLOWER
W20260502 14:06:07.241983 13186 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45396: Illegal state: replica 44217426b1314bc9824ba6f1661fb95b is not leader of this config: current role FOLLOWER
W20260502 14:06:07.250540 13322 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50454: Illegal state: replica a1ddd8ec7c554a1eadeb2aa58f7ab430 is not leader of this config: current role FOLLOWER
W20260502 14:06:07.252532 13322 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50454: Illegal state: replica a1ddd8ec7c554a1eadeb2aa58f7ab430 is not leader of this config: current role FOLLOWER
W20260502 14:06:07.266355 13186 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45396: Illegal state: replica 44217426b1314bc9824ba6f1661fb95b is not leader of this config: current role FOLLOWER
W20260502 14:06:07.272293 13322 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50454: Illegal state: replica a1ddd8ec7c554a1eadeb2aa58f7ab430 is not leader of this config: current role FOLLOWER
W20260502 14:06:07.272294 13323 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50454: Illegal state: replica a1ddd8ec7c554a1eadeb2aa58f7ab430 is not leader of this config: current role FOLLOWER
W20260502 14:06:07.272588 13321 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50454: Illegal state: replica a1ddd8ec7c554a1eadeb2aa58f7ab430 is not leader of this config: current role FOLLOWER
W20260502 14:06:07.285826 13186 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45396: Illegal state: replica 44217426b1314bc9824ba6f1661fb95b is not leader of this config: current role FOLLOWER
W20260502 14:06:07.287151 13186 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45396: Illegal state: replica 44217426b1314bc9824ba6f1661fb95b is not leader of this config: current role FOLLOWER
W20260502 14:06:07.287892 13186 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45396: Illegal state: replica 44217426b1314bc9824ba6f1661fb95b is not leader of this config: current role FOLLOWER
W20260502 14:06:07.289233 13186 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45396: Illegal state: replica 44217426b1314bc9824ba6f1661fb95b is not leader of this config: current role FOLLOWER
W20260502 14:06:07.290326 13186 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45396: Illegal state: replica 44217426b1314bc9824ba6f1661fb95b is not leader of this config: current role FOLLOWER
W20260502 14:06:07.298970 13186 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45396: Illegal state: replica 44217426b1314bc9824ba6f1661fb95b is not leader of this config: current role FOLLOWER
W20260502 14:06:07.301622 13321 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50454: Illegal state: replica a1ddd8ec7c554a1eadeb2aa58f7ab430 is not leader of this config: current role FOLLOWER
W20260502 14:06:07.304710 13321 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50454: Illegal state: replica a1ddd8ec7c554a1eadeb2aa58f7ab430 is not leader of this config: current role FOLLOWER
W20260502 14:06:07.324013 13321 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50454: Illegal state: replica a1ddd8ec7c554a1eadeb2aa58f7ab430 is not leader of this config: current role FOLLOWER
W20260502 14:06:07.324555 13321 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50454: Illegal state: replica a1ddd8ec7c554a1eadeb2aa58f7ab430 is not leader of this config: current role FOLLOWER
W20260502 14:06:07.327170 13321 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50454: Illegal state: replica a1ddd8ec7c554a1eadeb2aa58f7ab430 is not leader of this config: current role FOLLOWER
W20260502 14:06:07.339620 13321 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50454: Illegal state: replica a1ddd8ec7c554a1eadeb2aa58f7ab430 is not leader of this config: current role FOLLOWER
W20260502 14:06:07.339704 13323 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50454: Illegal state: replica a1ddd8ec7c554a1eadeb2aa58f7ab430 is not leader of this config: current role FOLLOWER
W20260502 14:06:07.342284 13186 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45396: Illegal state: replica 44217426b1314bc9824ba6f1661fb95b is not leader of this config: current role FOLLOWER
W20260502 14:06:07.344282 13186 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45396: Illegal state: replica 44217426b1314bc9824ba6f1661fb95b is not leader of this config: current role FOLLOWER
W20260502 14:06:07.344653 13323 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50454: Illegal state: replica a1ddd8ec7c554a1eadeb2aa58f7ab430 is not leader of this config: current role FOLLOWER
W20260502 14:06:07.347146 13186 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45396: Illegal state: replica 44217426b1314bc9824ba6f1661fb95b is not leader of this config: current role FOLLOWER
W20260502 14:06:07.360317 13186 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45396: Illegal state: replica 44217426b1314bc9824ba6f1661fb95b is not leader of this config: current role FOLLOWER
W20260502 14:06:07.384878 13323 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50454: Illegal state: replica a1ddd8ec7c554a1eadeb2aa58f7ab430 is not leader of this config: current role FOLLOWER
W20260502 14:06:07.385943 13186 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45396: Illegal state: replica 44217426b1314bc9824ba6f1661fb95b is not leader of this config: current role FOLLOWER
W20260502 14:06:07.387005 13323 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50454: Illegal state: replica a1ddd8ec7c554a1eadeb2aa58f7ab430 is not leader of this config: current role FOLLOWER
W20260502 14:06:07.388108 13186 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45396: Illegal state: replica 44217426b1314bc9824ba6f1661fb95b is not leader of this config: current role FOLLOWER
W20260502 14:06:07.392258 13323 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50454: Illegal state: replica a1ddd8ec7c554a1eadeb2aa58f7ab430 is not leader of this config: current role FOLLOWER
W20260502 14:06:07.406703 13323 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50454: Illegal state: replica a1ddd8ec7c554a1eadeb2aa58f7ab430 is not leader of this config: current role FOLLOWER
W20260502 14:06:07.410393 13186 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45396: Illegal state: replica 44217426b1314bc9824ba6f1661fb95b is not leader of this config: current role FOLLOWER
W20260502 14:06:07.410394 13190 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45396: Illegal state: replica 44217426b1314bc9824ba6f1661fb95b is not leader of this config: current role FOLLOWER
W20260502 14:06:07.415443 13190 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45396: Illegal state: replica 44217426b1314bc9824ba6f1661fb95b is not leader of this config: current role FOLLOWER
W20260502 14:06:07.431675 13190 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45396: Illegal state: replica 44217426b1314bc9824ba6f1661fb95b is not leader of this config: current role FOLLOWER
I20260502 14:06:07.436547 13518 raft_consensus.cc:493] T 7e3a9b987a4e4c74b1f1af634b2f81ed P 44217426b1314bc9824ba6f1661fb95b [term 1 FOLLOWER]: Starting pre-election (detected failure of leader d6662a7688c84665b55623d3f97e4ae3)
I20260502 14:06:07.436635 13518 raft_consensus.cc:515] T 7e3a9b987a4e4c74b1f1af634b2f81ed P 44217426b1314bc9824ba6f1661fb95b [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:07.436846 13518 leader_election.cc:290] T 7e3a9b987a4e4c74b1f1af634b2f81ed P 44217426b1314bc9824ba6f1661fb95b [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257), a1ddd8ec7c554a1eadeb2aa58f7ab430 (127.12.158.67:33939)
I20260502 14:06:07.437119 13364 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "7e3a9b987a4e4c74b1f1af634b2f81ed" candidate_uuid: "44217426b1314bc9824ba6f1661fb95b" candidate_term: 2 candidate_status { last_received { term: 1 index: 112 } } ignore_live_leader: false dest_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" is_pre_election: true
W20260502 14:06:07.437426 13169 leader_election.cc:336] T 7e3a9b987a4e4c74b1f1af634b2f81ed P 44217426b1314bc9824ba6f1661fb95b [CANDIDATE]: Term 2 pre-election: RPC error from VoteRequest() call to peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111)
I20260502 14:06:07.437495 13169 leader_election.cc:304] T 7e3a9b987a4e4c74b1f1af634b2f81ed P 44217426b1314bc9824ba6f1661fb95b [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 44217426b1314bc9824ba6f1661fb95b; no voters: a1ddd8ec7c554a1eadeb2aa58f7ab430, d6662a7688c84665b55623d3f97e4ae3
I20260502 14:06:07.437569 13518 raft_consensus.cc:2749] T 7e3a9b987a4e4c74b1f1af634b2f81ed P 44217426b1314bc9824ba6f1661fb95b [term 1 FOLLOWER]: Leader pre-election lost for term 2. Reason: could not achieve majority
I20260502 14:06:07.437856 13537 raft_consensus.cc:493] T d4c02bd85161406f918b1aee009c7294 P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 1 FOLLOWER]: Starting pre-election (detected failure of leader d6662a7688c84665b55623d3f97e4ae3)
I20260502 14:06:07.437906 13537 raft_consensus.cc:515] T d4c02bd85161406f918b1aee009c7294 P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } }
I20260502 14:06:07.438071 13537 leader_election.cc:290] T d4c02bd85161406f918b1aee009c7294 P a1ddd8ec7c554a1eadeb2aa58f7ab430 [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers 44217426b1314bc9824ba6f1661fb95b (127.12.158.66:33299), d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257)
I20260502 14:06:07.438128 13518 raft_consensus.cc:493] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b [term 1 FOLLOWER]: Starting pre-election (detected failure of leader d6662a7688c84665b55623d3f97e4ae3)
I20260502 14:06:07.438186 13518 raft_consensus.cc:515] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } }
I20260502 14:06:07.438366 13518 leader_election.cc:290] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers a1ddd8ec7c554a1eadeb2aa58f7ab430 (127.12.158.67:33939), d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257)
I20260502 14:06:07.438377 13232 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d4c02bd85161406f918b1aee009c7294" candidate_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" candidate_term: 2 candidate_status { last_received { term: 1 index: 111 } } ignore_live_leader: false dest_uuid: "44217426b1314bc9824ba6f1661fb95b" is_pre_election: true
I20260502 14:06:07.438486 13232 raft_consensus.cc:2410] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b [term 1 FOLLOWER]: Leader pre-election vote request: Denying vote to candidate a1ddd8ec7c554a1eadeb2aa58f7ab430 for term 2 because replica has last-logged OpId of term: 1 index: 112, which is greater than that of the candidate, which has last-logged OpId of term: 1 index: 111.
W20260502 14:06:07.438512 13300 leader_election.cc:336] T d4c02bd85161406f918b1aee009c7294 P a1ddd8ec7c554a1eadeb2aa58f7ab430 [CANDIDATE]: Term 2 pre-election: RPC error from VoteRequest() call to peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111)
I20260502 14:06:07.438532 13364 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d4c02bd85161406f918b1aee009c7294" candidate_uuid: "44217426b1314bc9824ba6f1661fb95b" candidate_term: 2 candidate_status { last_received { term: 1 index: 112 } } ignore_live_leader: false dest_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" is_pre_election: true
I20260502 14:06:07.438650 13299 leader_election.cc:304] T d4c02bd85161406f918b1aee009c7294 P a1ddd8ec7c554a1eadeb2aa58f7ab430 [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: a1ddd8ec7c554a1eadeb2aa58f7ab430; no voters: 44217426b1314bc9824ba6f1661fb95b, d6662a7688c84665b55623d3f97e4ae3
I20260502 14:06:07.438686 13364 raft_consensus.cc:2468] T d4c02bd85161406f918b1aee009c7294 P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 1 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 44217426b1314bc9824ba6f1661fb95b in term 1.
I20260502 14:06:07.438759 13537 raft_consensus.cc:2749] T d4c02bd85161406f918b1aee009c7294 P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 1 FOLLOWER]: Leader pre-election lost for term 2. Reason: could not achieve majority
I20260502 14:06:07.438812 13166 leader_election.cc:304] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 44217426b1314bc9824ba6f1661fb95b, a1ddd8ec7c554a1eadeb2aa58f7ab430; no voters:
I20260502 14:06:07.438886 13518 raft_consensus.cc:2804] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b [term 1 FOLLOWER]: Leader pre-election won for term 2
I20260502 14:06:07.438920 13518 raft_consensus.cc:493] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b [term 1 FOLLOWER]: Starting leader election (detected failure of leader d6662a7688c84665b55623d3f97e4ae3)
I20260502 14:06:07.438949 13518 raft_consensus.cc:3060] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b [term 1 FOLLOWER]: Advancing to term 2
W20260502 14:06:07.439092 13169 leader_election.cc:336] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b [CANDIDATE]: Term 2 pre-election: RPC error from VoteRequest() call to peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111)
I20260502 14:06:07.439692 13518 raft_consensus.cc:515] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b [term 2 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } }
I20260502 14:06:07.439870 13518 leader_election.cc:290] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b [CANDIDATE]: Term 2 election: Requested vote from peers a1ddd8ec7c554a1eadeb2aa58f7ab430 (127.12.158.67:33939), d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257)
I20260502 14:06:07.439883 13537 raft_consensus.cc:493] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 1 FOLLOWER]: Starting pre-election (detected failure of leader d6662a7688c84665b55623d3f97e4ae3)
I20260502 14:06:07.439951 13537 raft_consensus.cc:515] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:07.440092 13537 leader_election.cc:290] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257), 44217426b1314bc9824ba6f1661fb95b (127.12.158.66:33299)
I20260502 14:06:07.440238 13232 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "7e3a9b987a4e4c74b1f1af634b2f81ed" candidate_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" candidate_term: 2 candidate_status { last_received { term: 1 index: 112 } } ignore_live_leader: false dest_uuid: "44217426b1314bc9824ba6f1661fb95b" is_pre_election: true
I20260502 14:06:07.440327 13232 raft_consensus.cc:2468] T 7e3a9b987a4e4c74b1f1af634b2f81ed P 44217426b1314bc9824ba6f1661fb95b [term 1 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate a1ddd8ec7c554a1eadeb2aa58f7ab430 in term 1.
I20260502 14:06:07.440403 13364 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d4c02bd85161406f918b1aee009c7294" candidate_uuid: "44217426b1314bc9824ba6f1661fb95b" candidate_term: 2 candidate_status { last_received { term: 1 index: 112 } } ignore_live_leader: false dest_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430"
I20260502 14:06:07.440466 13364 raft_consensus.cc:3060] T d4c02bd85161406f918b1aee009c7294 P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 1 FOLLOWER]: Advancing to term 2
I20260502 14:06:07.440527 13299 leader_election.cc:304] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 44217426b1314bc9824ba6f1661fb95b, a1ddd8ec7c554a1eadeb2aa58f7ab430; no voters:
I20260502 14:06:07.440613 13537 raft_consensus.cc:2804] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 1 FOLLOWER]: Leader pre-election won for term 2
I20260502 14:06:07.440656 13537 raft_consensus.cc:493] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 1 FOLLOWER]: Starting leader election (detected failure of leader d6662a7688c84665b55623d3f97e4ae3)
I20260502 14:06:07.440677 13537 raft_consensus.cc:3060] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 1 FOLLOWER]: Advancing to term 2
I20260502 14:06:07.441145 13364 raft_consensus.cc:2468] T d4c02bd85161406f918b1aee009c7294 P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 2 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 44217426b1314bc9824ba6f1661fb95b in term 2.
I20260502 14:06:07.441335 13166 leader_election.cc:304] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b [CANDIDATE]: Term 2 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 44217426b1314bc9824ba6f1661fb95b, a1ddd8ec7c554a1eadeb2aa58f7ab430; no voters:
I20260502 14:06:07.441365 13537 raft_consensus.cc:515] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 2 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:07.441471 13518 raft_consensus.cc:2804] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b [term 2 FOLLOWER]: Leader election won for term 2
I20260502 14:06:07.441493 13537 leader_election.cc:290] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 [CANDIDATE]: Term 2 election: Requested vote from peers d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257), 44217426b1314bc9824ba6f1661fb95b (127.12.158.66:33299)
I20260502 14:06:07.441525 13518 raft_consensus.cc:697] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b [term 2 LEADER]: Becoming Leader. State: Replica: 44217426b1314bc9824ba6f1661fb95b, State: Running, Role: LEADER
I20260502 14:06:07.441623 13232 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "7e3a9b987a4e4c74b1f1af634b2f81ed" candidate_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" candidate_term: 2 candidate_status { last_received { term: 1 index: 112 } } ignore_live_leader: false dest_uuid: "44217426b1314bc9824ba6f1661fb95b"
I20260502 14:06:07.441571 13518 consensus_queue.cc:237] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 111, Committed index: 111, Last appended: 1.112, Last appended by leader: 112, Current term: 2, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } }
I20260502 14:06:07.441668 13232 raft_consensus.cc:3060] T 7e3a9b987a4e4c74b1f1af634b2f81ed P 44217426b1314bc9824ba6f1661fb95b [term 1 FOLLOWER]: Advancing to term 2
W20260502 14:06:07.441742 13300 leader_election.cc:336] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 [CANDIDATE]: Term 2 pre-election: RPC error from VoteRequest() call to peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111)
W20260502 14:06:07.441979 13300 leader_election.cc:336] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 [CANDIDATE]: Term 2 election: RPC error from VoteRequest() call to peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111)
W20260502 14:06:07.442019 13169 leader_election.cc:336] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b [CANDIDATE]: Term 2 election: RPC error from VoteRequest() call to peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111)
I20260502 14:06:07.442363 13232 raft_consensus.cc:2468] T 7e3a9b987a4e4c74b1f1af634b2f81ed P 44217426b1314bc9824ba6f1661fb95b [term 2 FOLLOWER]: Leader election vote request: Granting yes vote for candidate a1ddd8ec7c554a1eadeb2aa58f7ab430 in term 2.
I20260502 14:06:07.442317 12958 catalog_manager.cc:5671] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b reported cstate change: term changed from 1 to 2, leader changed from d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65) to 44217426b1314bc9824ba6f1661fb95b (127.12.158.66). New cstate: current_term: 2 leader_uuid: "44217426b1314bc9824ba6f1661fb95b" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } health_report { overall_health: UNKNOWN } } }
I20260502 14:06:07.442692 13299 leader_election.cc:304] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 [CANDIDATE]: Term 2 election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 44217426b1314bc9824ba6f1661fb95b, a1ddd8ec7c554a1eadeb2aa58f7ab430; no voters: d6662a7688c84665b55623d3f97e4ae3
I20260502 14:06:07.442787 13537 raft_consensus.cc:2804] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 2 FOLLOWER]: Leader election won for term 2
I20260502 14:06:07.442832 13537 raft_consensus.cc:697] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 2 LEADER]: Becoming Leader. State: Replica: a1ddd8ec7c554a1eadeb2aa58f7ab430, State: Running, Role: LEADER
I20260502 14:06:07.442893 13537 consensus_queue.cc:237] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 111, Committed index: 111, Last appended: 1.112, Last appended by leader: 112, Current term: 2, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:07.443331 12958 catalog_manager.cc:5671] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 reported cstate change: term changed from 1 to 2, leader changed from d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65) to a1ddd8ec7c554a1eadeb2aa58f7ab430 (127.12.158.67). New cstate: current_term: 2 leader_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } health_report { overall_health: HEALTHY } } }
W20260502 14:06:07.458029 13323 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50454: Illegal state: replica a1ddd8ec7c554a1eadeb2aa58f7ab430 is not leader of this config: current role FOLLOWER
I20260502 14:06:07.459367 13232 raft_consensus.cc:1275] T 7e3a9b987a4e4c74b1f1af634b2f81ed P 44217426b1314bc9824ba6f1661fb95b [term 2 FOLLOWER]: Refusing update from remote peer a1ddd8ec7c554a1eadeb2aa58f7ab430: Log matching property violated. Preceding OpId in replica: term: 1 index: 112. Preceding OpId from leader: term: 2 index: 114. (index mismatch)
W20260502 14:06:07.459527 13300 consensus_peers.cc:597] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
I20260502 14:06:07.459697 13535 consensus_queue.cc:1048] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 [LEADER]: Connected to new peer: Peer: permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 113, Last known committed idx: 111, Time since last communication: 0.000s
W20260502 14:06:07.460286 13323 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50454: Illegal state: replica a1ddd8ec7c554a1eadeb2aa58f7ab430 is not leader of this config: current role FOLLOWER
W20260502 14:06:07.464498 13323 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50454: Illegal state: replica a1ddd8ec7c554a1eadeb2aa58f7ab430 is not leader of this config: current role FOLLOWER
I20260502 14:06:07.482661 13364 raft_consensus.cc:1275] T d4c02bd85161406f918b1aee009c7294 P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 2 FOLLOWER]: Refusing update from remote peer 44217426b1314bc9824ba6f1661fb95b: Log matching property violated. Preceding OpId in replica: term: 1 index: 111. Preceding OpId from leader: term: 2 index: 114. (index mismatch)
W20260502 14:06:07.482906 13169 consensus_peers.cc:597] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
I20260502 14:06:07.482985 13518 consensus_queue.cc:1048] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b [LEADER]: Connected to new peer: Peer: permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 113, Last known committed idx: 108, Time since last communication: 0.000s
W20260502 14:06:07.572799 13169 consensus_peers.cc:597] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20260502 14:06:07.619009 13300 consensus_peers.cc:597] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20260502 14:06:07.637995 13169 consensus_peers.cc:597] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20260502 14:06:07.642532 13169 consensus_peers.cc:597] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20260502 14:06:07.956712 13300 consensus_peers.cc:597] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20260502 14:06:07.992949 13169 consensus_peers.cc:597] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
I20260502 14:06:08.035813 13674 heartbeater.cc:499] Master 127.12.158.126:43627 was elected leader, sending a full tablet report...
W20260502 14:06:08.097101 13169 consensus_peers.cc:597] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20260502 14:06:08.118257 13169 consensus_peers.cc:597] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20260502 14:06:08.163479 13300 consensus_peers.cc:597] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20260502 14:06:08.165629 13169 consensus_peers.cc:597] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20260502 14:06:08.421248 13300 consensus_peers.cc:597] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20260502 14:06:08.504361 13169 consensus_peers.cc:597] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20260502 14:06:08.613605 13169 consensus_peers.cc:597] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20260502 14:06:08.616359 13169 consensus_peers.cc:597] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20260502 14:06:08.636145 13169 consensus_peers.cc:597] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20260502 14:06:08.657639 13300 consensus_peers.cc:597] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20260502 14:06:08.986357 13300 consensus_peers.cc:597] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20260502 14:06:09.022852 13169 consensus_peers.cc:597] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20260502 14:06:09.114501 13169 consensus_peers.cc:597] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
W20260502 14:06:09.124085 13169 consensus_peers.cc:597] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20260502 14:06:09.159857 13684 consensus_queue.cc:579] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b [LEADER]: Leader has been unable to successfully communicate with peer d6662a7688c84665b55623d3f97e4ae3 for more than 2 seconds (2.029s)
W20260502 14:06:09.162763 13169 consensus_peers.cc:597] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20260502 14:06:09.194870 13683 consensus_queue.cc:579] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 [LEADER]: Leader has been unable to successfully communicate with peer d6662a7688c84665b55623d3f97e4ae3 for more than 2 seconds (2.064s)
W20260502 14:06:09.197945 13300 consensus_peers.cc:597] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20260502 14:06:09.201246 13516 consensus_queue.cc:579] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b [LEADER]: Leader has been unable to successfully communicate with peer d6662a7688c84665b55623d3f97e4ae3 for more than 2 seconds (2.067s)
I20260502 14:06:09.236363 13691 consensus_queue.cc:579] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b [LEADER]: Leader has been unable to successfully communicate with peer d6662a7688c84665b55623d3f97e4ae3 for more than 2 seconds (2.103s)
I20260502 14:06:09.473737 13681 consensus_queue.cc:579] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 [LEADER]: Leader has been unable to successfully communicate with peer d6662a7688c84665b55623d3f97e4ae3 for more than 2 seconds (2.031s)
W20260502 14:06:09.479274 13300 consensus_peers.cc:597] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20260502 14:06:09.531520 13541 consensus_queue.cc:579] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b [LEADER]: Leader has been unable to successfully communicate with peer d6662a7688c84665b55623d3f97e4ae3 for more than 2 seconds (2.090s)
W20260502 14:06:09.534660 13169 consensus_peers.cc:597] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
W20260502 14:06:09.620469 13169 consensus_peers.cc:597] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20260502 14:06:09.654418 13169 consensus_peers.cc:597] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20260502 14:06:09.657991 13169 consensus_peers.cc:597] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20260502 14:06:09.716434 13300 consensus_peers.cc:597] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20260502 14:06:09.993214 13300 consensus_peers.cc:597] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20260502 14:06:10.047726 13169 consensus_peers.cc:597] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20260502 14:06:10.139876 13169 consensus_peers.cc:597] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
W20260502 14:06:10.160600 13169 consensus_peers.cc:597] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
W20260502 14:06:10.162343 13169 consensus_peers.cc:597] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
I20260502 14:06:10.172060 12921 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskW6FTeC/build/release/bin/kudu with pid 12929
I20260502 14:06:10.181799 12921 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskW6FTeC/build/release/bin/kudu
/tmp/dist-test-taskW6FTeC/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.12.158.126:43627
--webserver_interface=127.12.158.126
--webserver_port=35229
--builtin_ntp_servers=127.12.158.84:36513
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.12.158.126:43627 with env {}
W20260502 14:06:10.267072 13300 consensus_peers.cc:597] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
W20260502 14:06:10.331264 13698 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260502 14:06:10.331555 13698 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260502 14:06:10.331652 13698 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260502 14:06:10.333968 13698 flags.cc:432] Enabled experimental flag: --ipki_ca_key_size=768
W20260502 14:06:10.334215 13698 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260502 14:06:10.334287 13698 flags.cc:432] Enabled experimental flag: --tsk_num_rsa_bits=512
W20260502 14:06:10.334326 13698 flags.cc:432] Enabled experimental flag: --rpc_reuseport=true
I20260502 14:06:10.336441 13698 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.12.158.84:36513
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.12.158.126:43627
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.12.158.126:43627
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.12.158.126
--webserver_port=35229
--never_fsync=true
--heap_profile_path=/tmp/kudu.13698
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true
Master server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type RELEASE
built by None at 02 May 2026 13:43:15 UTC on e7f111948823
build id 11709
I20260502 14:06:10.336648 13698 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260502 14:06:10.336941 13698 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260502 14:06:10.340054 13706 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:10.340214 13704 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:10.340634 13703 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260502 14:06:10.341500 13698 server_base.cc:1061] running on GCE node
I20260502 14:06:10.341773 13698 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260502 14:06:10.342005 13698 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260502 14:06:10.343287 13698 hybrid_clock.cc:648] HybridClock initialized: now 1777730770343264 us; error 31 us; skew 500 ppm
I20260502 14:06:10.344729 13698 webserver.cc:492] Webserver started at http://127.12.158.126:35229/ using document root <none> and password file <none>
I20260502 14:06:10.344919 13698 fs_manager.cc:362] Metadata directory not provided
I20260502 14:06:10.344974 13698 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260502 14:06:10.356906 13698 fs_manager.cc:714] Time spent opening directory manager: real 0.011s user 0.000s sys 0.001s
I20260502 14:06:10.358783 13712 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:10.358963 13698 fs_manager.cc:730] Time spent opening block manager: real 0.002s user 0.000s sys 0.001s
I20260502 14:06:10.359018 13698 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/master-0/data,/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/master-0/wal
uuid: "91962b7baa9544389d582013c70a54a1"
format_stamp: "Formatted at 2026-05-02 14:06:06 on dist-test-slave-23m0"
I20260502 14:06:10.359371 13698 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260502 14:06:10.373893 13698 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260502 14:06:10.374246 13698 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260502 14:06:10.374465 13698 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260502 14:06:10.381397 13698 rpc_server.cc:307] RPC server started. Bound to: 127.12.158.126:43627
I20260502 14:06:10.382683 13698 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/master-0/data/info.pb
I20260502 14:06:10.390323 12921 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskW6FTeC/build/release/bin/kudu as pid 13698
I20260502 14:06:10.392127 13765 sys_catalog.cc:263] Verifying existing consensus state
I20260502 14:06:10.392989 13765 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1: Bootstrap starting.
I20260502 14:06:10.395336 13764 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.12.158.126:43627 every 8 connection(s)
I20260502 14:06:10.405195 13765 log.cc:826] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1: Log is configured to *not* fsync() on all Append() calls
I20260502 14:06:10.408483 13765 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1: Bootstrap replayed 1/1 log segments. Stats: ops{read=15 overwritten=0 applied=15 ignored=0} inserts{seen=11 ignored=0} mutations{seen=14 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20260502 14:06:10.408823 13765 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1: Bootstrap complete.
I20260502 14:06:10.411727 13765 raft_consensus.cc:359] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1 [term 1 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "91962b7baa9544389d582013c70a54a1" member_type: VOTER last_known_addr { host: "127.12.158.126" port: 43627 } }
I20260502 14:06:10.413203 13765 raft_consensus.cc:740] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1 [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: 91962b7baa9544389d582013c70a54a1, State: Initialized, Role: FOLLOWER
I20260502 14:06:10.413414 13765 consensus_queue.cc:260] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 15, Last appended: 1.15, Last appended by leader: 15, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "91962b7baa9544389d582013c70a54a1" member_type: VOTER last_known_addr { host: "127.12.158.126" port: 43627 } }
I20260502 14:06:10.413502 13765 raft_consensus.cc:399] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1 [term 1 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20260502 14:06:10.413592 13765 raft_consensus.cc:493] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1 [term 1 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20260502 14:06:10.413714 13765 raft_consensus.cc:3060] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1 [term 1 FOLLOWER]: Advancing to term 2
I20260502 14:06:10.415443 13765 raft_consensus.cc:515] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1 [term 2 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "91962b7baa9544389d582013c70a54a1" member_type: VOTER last_known_addr { host: "127.12.158.126" port: 43627 } }
I20260502 14:06:10.415556 13765 leader_election.cc:304] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1 [CANDIDATE]: Term 2 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 91962b7baa9544389d582013c70a54a1; no voters:
I20260502 14:06:10.415747 13765 leader_election.cc:290] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1 [CANDIDATE]: Term 2 election: Requested vote from peers
I20260502 14:06:10.415844 13768 raft_consensus.cc:2804] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1 [term 2 FOLLOWER]: Leader election won for term 2
I20260502 14:06:10.416064 13768 raft_consensus.cc:697] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1 [term 2 LEADER]: Becoming Leader. State: Replica: 91962b7baa9544389d582013c70a54a1, State: Running, Role: LEADER
I20260502 14:06:10.416064 13765 sys_catalog.cc:565] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1 [sys.catalog]: configured and running, proceeding with master startup.
I20260502 14:06:10.417076 13768 consensus_queue.cc:237] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 15, Committed index: 15, Last appended: 1.15, Last appended by leader: 15, Current term: 2, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "91962b7baa9544389d582013c70a54a1" member_type: VOTER last_known_addr { host: "127.12.158.126" port: 43627 } }
I20260502 14:06:10.417510 13768 sys_catalog.cc:455] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 2 leader_uuid: "91962b7baa9544389d582013c70a54a1" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "91962b7baa9544389d582013c70a54a1" member_type: VOTER last_known_addr { host: "127.12.158.126" port: 43627 } } }
I20260502 14:06:10.417573 13768 sys_catalog.cc:458] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1 [sys.catalog]: This master's current role is: LEADER
I20260502 14:06:10.417695 13768 sys_catalog.cc:455] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1 [sys.catalog]: SysCatalogTable state changed. Reason: New leader 91962b7baa9544389d582013c70a54a1. Latest consensus state: current_term: 2 leader_uuid: "91962b7baa9544389d582013c70a54a1" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "91962b7baa9544389d582013c70a54a1" member_type: VOTER last_known_addr { host: "127.12.158.126" port: 43627 } } }
I20260502 14:06:10.417740 13768 sys_catalog.cc:458] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1 [sys.catalog]: This master's current role is: LEADER
I20260502 14:06:10.418901 13777 catalog_manager.cc:1485] Loading table and tablet metadata into memory...
I20260502 14:06:10.419864 13777 catalog_manager.cc:679] Loaded metadata for table test-workload [id=0eed4d37dcb4435eb14316140701d979]
I20260502 14:06:10.420141 13777 tablet_loader.cc:96] loaded metadata for tablet 1fd859f7e3144efcacca4175c53ef5df (table test-workload [id=0eed4d37dcb4435eb14316140701d979])
I20260502 14:06:10.420265 13777 tablet_loader.cc:96] loaded metadata for tablet 7e3a9b987a4e4c74b1f1af634b2f81ed (table test-workload [id=0eed4d37dcb4435eb14316140701d979])
I20260502 14:06:10.420429 13777 tablet_loader.cc:96] loaded metadata for tablet b175587f4e6a42a6b7e4740bae788705 (table test-workload [id=0eed4d37dcb4435eb14316140701d979])
I20260502 14:06:10.420529 13777 tablet_loader.cc:96] loaded metadata for tablet be812ec5b6a24e06bd0e507df248605e (table test-workload [id=0eed4d37dcb4435eb14316140701d979])
I20260502 14:06:10.420631 13777 tablet_loader.cc:96] loaded metadata for tablet bf9cb1779b0345fe868c2b551293512a (table test-workload [id=0eed4d37dcb4435eb14316140701d979])
I20260502 14:06:10.420715 13777 tablet_loader.cc:96] loaded metadata for tablet d4c02bd85161406f918b1aee009c7294 (table test-workload [id=0eed4d37dcb4435eb14316140701d979])
I20260502 14:06:10.420791 13777 catalog_manager.cc:1494] Initializing Kudu cluster ID...
I20260502 14:06:10.420958 13777 catalog_manager.cc:1269] Loaded cluster ID: d5274adbe7e045b8a0612b77419235d1
I20260502 14:06:10.421032 13777 catalog_manager.cc:1505] Initializing Kudu internal certificate authority...
I20260502 14:06:10.422688 13777 catalog_manager.cc:1514] Loading token signing keys...
I20260502 14:06:10.422892 13777 catalog_manager.cc:6055] T 00000000000000000000000000000000 P 91962b7baa9544389d582013c70a54a1: Loaded TSK: 0
I20260502 14:06:10.423345 13777 catalog_manager.cc:1524] Initializing in-progress tserver states...
I20260502 14:06:10.480547 13727 master_service.cc:438] Got heartbeat from unknown tserver (permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" instance_seqno: 1777730766720485) as {username='slave'} at 127.12.158.67:37585; Asking this server to re-register.
I20260502 14:06:10.481155 13410 heartbeater.cc:461] Registering TS with master...
I20260502 14:06:10.481360 13410 heartbeater.cc:507] Master 127.12.158.126:43627 requested a full tablet report, sending...
I20260502 14:06:10.482108 13727 ts_manager.cc:194] Registered new tserver with Master: a1ddd8ec7c554a1eadeb2aa58f7ab430 (127.12.158.67:33939)
W20260502 14:06:10.484663 13300 consensus_peers.cc:597] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
I20260502 14:06:10.540237 13727 master_service.cc:438] Got heartbeat from unknown tserver (permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" instance_seqno: 1777730766610487) as {username='slave'} at 127.12.158.66:59323; Asking this server to re-register.
I20260502 14:06:10.540623 13279 heartbeater.cc:461] Registering TS with master...
I20260502 14:06:10.540723 13279 heartbeater.cc:507] Master 127.12.158.126:43627 requested a full tablet report, sending...
I20260502 14:06:10.544458 13727 ts_manager.cc:194] Registered new tserver with Master: 44217426b1314bc9824ba6f1661fb95b (127.12.158.66:33299)
I20260502 14:06:10.578441 13518 consensus_queue.cc:799] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b [LEADER]: Peer d6662a7688c84665b55623d3f97e4ae3 is lagging by at least 2 ops behind the committed index
I20260502 14:06:10.581358 13694 consensus_queue.cc:799] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 [LEADER]: Peer d6662a7688c84665b55623d3f97e4ae3 is lagging by at least 1 ops behind the committed index
I20260502 14:06:10.587800 13535 consensus_queue.cc:799] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 [LEADER]: Peer d6662a7688c84665b55623d3f97e4ae3 is lagging by at least 1 ops behind the committed index
W20260502 14:06:10.588274 13169 consensus_peers.cc:597] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
I20260502 14:06:10.616130 13692 consensus_queue.cc:799] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b [LEADER]: Peer d6662a7688c84665b55623d3f97e4ae3 is lagging by at least 14 ops behind the committed index
I20260502 14:06:10.631680 13692 consensus_queue.cc:799] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b [LEADER]: Peer d6662a7688c84665b55623d3f97e4ae3 is lagging by at least 10 ops behind the committed index
W20260502 14:06:10.638998 13169 consensus_peers.cc:597] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 36: this message will repeat every 5th retry.
I20260502 14:06:10.666927 13515 consensus_queue.cc:799] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b [LEADER]: Peer d6662a7688c84665b55623d3f97e4ae3 is lagging by at least 28 ops behind the committed index
W20260502 14:06:10.676191 13169 consensus_peers.cc:597] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 36: this message will repeat every 5th retry.
W20260502 14:06:10.708848 13169 consensus_peers.cc:597] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 36: this message will repeat every 5th retry.
W20260502 14:06:10.791864 13300 consensus_peers.cc:597] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 36: this message will repeat every 5th retry.
W20260502 14:06:11.012413 13300 consensus_peers.cc:597] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 36: this message will repeat every 5th retry.
I20260502 14:06:11.044658 13727 master_service.cc:438] Got heartbeat from unknown tserver (permanent_uuid: "3d450e4b5d234cc08a659a2375cf9753" instance_seqno: 1777730767022351) as {username='slave'} at 127.12.158.68:57929; Asking this server to re-register.
I20260502 14:06:11.044999 13674 heartbeater.cc:461] Registering TS with master...
I20260502 14:06:11.045121 13674 heartbeater.cc:507] Master 127.12.158.126:43627 requested a full tablet report, sending...
I20260502 14:06:11.045522 13727 ts_manager.cc:194] Registered new tserver with Master: 3d450e4b5d234cc08a659a2375cf9753 (127.12.158.68:35609)
W20260502 14:06:11.078370 13169 consensus_peers.cc:597] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 36: this message will repeat every 5th retry.
W20260502 14:06:11.173447 13169 consensus_peers.cc:597] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 41: this message will repeat every 5th retry.
W20260502 14:06:11.197656 13169 consensus_peers.cc:597] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 41: this message will repeat every 5th retry.
W20260502 14:06:11.274562 13169 consensus_peers.cc:597] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 41: this message will repeat every 5th retry.
W20260502 14:06:11.336143 13300 consensus_peers.cc:597] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 41: this message will repeat every 5th retry.
W20260502 14:06:11.550285 13300 consensus_peers.cc:597] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 41: this message will repeat every 5th retry.
W20260502 14:06:11.562609 13169 consensus_peers.cc:597] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 41: this message will repeat every 5th retry.
W20260502 14:06:11.638484 13169 consensus_peers.cc:597] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 46: this message will repeat every 5th retry.
W20260502 14:06:11.752494 13169 consensus_peers.cc:597] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 46: this message will repeat every 5th retry.
W20260502 14:06:11.816291 13169 consensus_peers.cc:597] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 46: this message will repeat every 5th retry.
W20260502 14:06:11.893980 13300 consensus_peers.cc:597] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 46: this message will repeat every 5th retry.
W20260502 14:06:12.124744 13300 consensus_peers.cc:597] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 46: this message will repeat every 5th retry.
W20260502 14:06:12.129318 13169 consensus_peers.cc:597] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 46: this message will repeat every 5th retry.
W20260502 14:06:12.148960 13169 proxy.cc:239] Call had error, refreshing address and retrying: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111) [suppressed 196 similar messages]
W20260502 14:06:12.149453 13169 consensus_peers.cc:597] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 51: this message will repeat every 5th retry.
W20260502 14:06:12.173062 13300 proxy.cc:239] Call had error, refreshing address and retrying: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111) [suppressed 96 similar messages]
W20260502 14:06:12.352134 13169 consensus_peers.cc:597] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 51: this message will repeat every 5th retry.
W20260502 14:06:12.355255 13169 consensus_peers.cc:597] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 51: this message will repeat every 5th retry.
W20260502 14:06:12.413960 13300 consensus_peers.cc:597] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 51: this message will repeat every 5th retry.
W20260502 14:06:12.654844 13169 consensus_peers.cc:597] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 51: this message will repeat every 5th retry.
W20260502 14:06:12.663892 13300 consensus_peers.cc:597] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 51: this message will repeat every 5th retry.
W20260502 14:06:12.679075 13169 consensus_peers.cc:597] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 56: this message will repeat every 5th retry.
W20260502 14:06:12.811338 13169 consensus_peers.cc:597] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 56: this message will repeat every 5th retry.
W20260502 14:06:12.861922 13169 consensus_peers.cc:597] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 56: this message will repeat every 5th retry.
W20260502 14:06:12.867251 13300 consensus_peers.cc:597] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 56: this message will repeat every 5th retry.
W20260502 14:06:13.165087 13169 consensus_peers.cc:597] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 56: this message will repeat every 5th retry.
W20260502 14:06:13.184237 13300 consensus_peers.cc:597] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 56: this message will repeat every 5th retry.
W20260502 14:06:13.200965 13169 consensus_peers.cc:597] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 61: this message will repeat every 5th retry.
W20260502 14:06:13.317967 13300 consensus_peers.cc:597] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 61: this message will repeat every 5th retry.
W20260502 14:06:13.347867 13169 consensus_peers.cc:597] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 61: this message will repeat every 5th retry.
W20260502 14:06:13.398882 13169 consensus_peers.cc:597] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 61: this message will repeat every 5th retry.
I20260502 14:06:13.412724 12921 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskW6FTeC/build/release/bin/kudu
/tmp/dist-test-taskW6FTeC/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.12.158.65:44257
--local_ip_for_outbound_sockets=127.12.158.65
--tserver_master_addrs=127.12.158.126:43627
--webserver_port=43375
--webserver_interface=127.12.158.65
--builtin_ntp_servers=127.12.158.84:36513
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--follower_unavailable_considered_failed_sec=2
--enable_log_gc=false with env {}
W20260502 14:06:13.585752 13797 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260502 14:06:13.586066 13797 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260502 14:06:13.586169 13797 flags.cc:432] Enabled unsafe flag: --enable_log_gc=false
W20260502 14:06:13.586263 13797 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260502 14:06:13.588824 13797 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260502 14:06:13.588989 13797 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.12.158.65
I20260502 14:06:13.591786 13797 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.12.158.84:36513
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--follower_unavailable_considered_failed_sec=2
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.12.158.65:44257
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.12.158.65
--webserver_port=43375
--enable_log_gc=false
--tserver_master_addrs=127.12.158.126:43627
--never_fsync=true
--heap_profile_path=/tmp/kudu.13797
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.12.158.65
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type RELEASE
built by None at 02 May 2026 13:43:15 UTC on e7f111948823
build id 11709
I20260502 14:06:13.592226 13797 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260502 14:06:13.592568 13797 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260502 14:06:13.593896 13797 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260502 14:06:13.598498 13802 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:13.598280 13805 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:13.599007 13803 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260502 14:06:13.602375 13797 server_base.cc:1061] running on GCE node
I20260502 14:06:13.602560 13797 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260502 14:06:13.602810 13797 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260502 14:06:13.611975 13797 hybrid_clock.cc:648] HybridClock initialized: now 1777730773611941 us; error 40 us; skew 500 ppm
I20260502 14:06:13.613801 13797 webserver.cc:492] Webserver started at http://127.12.158.65:43375/ using document root <none> and password file <none>
I20260502 14:06:13.614080 13797 fs_manager.cc:362] Metadata directory not provided
I20260502 14:06:13.614185 13797 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260502 14:06:13.615008 13515 consensus_queue.cc:799] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b [LEADER]: Peer d6662a7688c84665b55623d3f97e4ae3 is lagging by at least 1429 ops behind the committed index [suppressed 28 similar messages]
I20260502 14:06:13.617133 13681 consensus_queue.cc:799] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 [LEADER]: Peer d6662a7688c84665b55623d3f97e4ae3 is lagging by at least 1424 ops behind the committed index [suppressed 29 similar messages]
I20260502 14:06:13.619894 13797 fs_manager.cc:714] Time spent opening directory manager: real 0.005s user 0.001s sys 0.000s
I20260502 14:06:13.622519 13811 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:13.622750 13797 fs_manager.cc:730] Time spent opening block manager: real 0.002s user 0.001s sys 0.000s
I20260502 14:06:13.622817 13797 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-0/data,/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-0/wal
uuid: "d6662a7688c84665b55623d3f97e4ae3"
format_stamp: "Formatted at 2026-05-02 14:06:06 on dist-test-slave-23m0"
I20260502 14:06:13.623167 13797 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260502 14:06:13.642553 13797 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260502 14:06:13.642860 13797 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260502 14:06:13.643113 13797 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260502 14:06:13.643426 13797 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260502 14:06:13.644137 13818 ts_tablet_manager.cc:542] Loading tablet metadata (0/6 complete)
I20260502 14:06:13.647408 13797 ts_tablet_manager.cc:585] Loaded tablet metadata (6 total tablets, 6 live tablets)
I20260502 14:06:13.647459 13797 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.003s user 0.000s sys 0.000s
I20260502 14:06:13.647508 13797 ts_tablet_manager.cc:600] Registering tablets (0/6 complete)
I20260502 14:06:13.649935 13797 ts_tablet_manager.cc:616] Registered 6 tablets
I20260502 14:06:13.649972 13797 ts_tablet_manager.cc:595] Time spent register tablets: real 0.002s user 0.000s sys 0.000s
I20260502 14:06:13.670612 13797 rpc_server.cc:307] RPC server started. Bound to: 127.12.158.65:44257
I20260502 14:06:13.671062 13797 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777730766202997-12921-0/minicluster-data/ts-0/data/info.pb
I20260502 14:06:13.671339 13818 tablet_bootstrap.cc:492] T b175587f4e6a42a6b7e4740bae788705 P d6662a7688c84665b55623d3f97e4ae3: Bootstrap starting.
I20260502 14:06:13.673183 13925 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.12.158.65:44257 every 8 connection(s)
I20260502 14:06:13.676224 12921 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskW6FTeC/build/release/bin/kudu as pid 13797
I20260502 14:06:13.689401 13681 consensus_queue.cc:799] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 [LEADER]: Peer d6662a7688c84665b55623d3f97e4ae3 is lagging by at least 1454 ops behind the committed index [suppressed 28 similar messages]
I20260502 14:06:13.697578 13926 heartbeater.cc:344] Connected to a master server at 127.12.158.126:43627
I20260502 14:06:13.697696 13926 heartbeater.cc:461] Registering TS with master...
I20260502 14:06:13.698956 13926 heartbeater.cc:507] Master 127.12.158.126:43627 requested a full tablet report, sending...
I20260502 14:06:13.699625 13727 ts_manager.cc:194] Registered new tserver with Master: d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257)
I20260502 14:06:13.700522 13727 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.12.158.65:57617
I20260502 14:06:13.703080 13511 consensus_queue.cc:799] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b [LEADER]: Peer d6662a7688c84665b55623d3f97e4ae3 is lagging by at least 1463 ops behind the committed index [suppressed 28 similar messages]
I20260502 14:06:13.705857 13515 consensus_queue.cc:799] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b [LEADER]: Peer d6662a7688c84665b55623d3f97e4ae3 is lagging by at least 1456 ops behind the committed index [suppressed 29 similar messages]
I20260502 14:06:13.711426 13818 log.cc:826] T b175587f4e6a42a6b7e4740bae788705 P d6662a7688c84665b55623d3f97e4ae3: Log is configured to *not* fsync() on all Append() calls
W20260502 14:06:13.728072 13300 consensus_peers.cc:597] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: INITIALIZED. This is attempt 61: this message will repeat every 5th retry.
W20260502 14:06:13.732641 13169 consensus_peers.cc:597] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: INITIALIZED. This is attempt 61: this message will repeat every 5th retry.
W20260502 14:06:13.737577 13169 consensus_peers.cc:597] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: INITIALIZED. This is attempt 66: this message will repeat every 5th retry.
I20260502 14:06:13.742453 13791 consensus_queue.cc:799] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b [LEADER]: Peer d6662a7688c84665b55623d3f97e4ae3 is lagging by at least 1468 ops behind the committed index [suppressed 28 similar messages]
I20260502 14:06:13.760440 13818 tablet_bootstrap.cc:492] T b175587f4e6a42a6b7e4740bae788705 P d6662a7688c84665b55623d3f97e4ae3: Bootstrap replayed 1/1 log segments. Stats: ops{read=112 overwritten=0 applied=111 ignored=0} inserts{seen=892 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20260502 14:06:13.762549 13818 tablet_bootstrap.cc:492] T b175587f4e6a42a6b7e4740bae788705 P d6662a7688c84665b55623d3f97e4ae3: Bootstrap complete.
I20260502 14:06:13.764535 13818 ts_tablet_manager.cc:1403] T b175587f4e6a42a6b7e4740bae788705 P d6662a7688c84665b55623d3f97e4ae3: Time spent bootstrapping tablet: real 0.108s user 0.010s sys 0.016s
I20260502 14:06:13.766501 13818 raft_consensus.cc:359] T b175587f4e6a42a6b7e4740bae788705 P d6662a7688c84665b55623d3f97e4ae3 [term 1 FOLLOWER]: Replica starting. Triggering 1 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:13.767649 13818 raft_consensus.cc:740] T b175587f4e6a42a6b7e4740bae788705 P d6662a7688c84665b55623d3f97e4ae3 [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: d6662a7688c84665b55623d3f97e4ae3, State: Initialized, Role: FOLLOWER
I20260502 14:06:13.767864 13818 consensus_queue.cc:260] T b175587f4e6a42a6b7e4740bae788705 P d6662a7688c84665b55623d3f97e4ae3 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 111, Last appended: 1.112, Last appended by leader: 112, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:13.768242 13818 ts_tablet_manager.cc:1434] T b175587f4e6a42a6b7e4740bae788705 P d6662a7688c84665b55623d3f97e4ae3: Time spent starting tablet: real 0.003s user 0.001s sys 0.001s
I20260502 14:06:13.768396 13926 heartbeater.cc:499] Master 127.12.158.126:43627 was elected leader, sending a full tablet report...
I20260502 14:06:13.769407 13818 tablet_bootstrap.cc:492] T 7e3a9b987a4e4c74b1f1af634b2f81ed P d6662a7688c84665b55623d3f97e4ae3: Bootstrap starting.
I20260502 14:06:13.797722 13818 tablet_bootstrap.cc:492] T 7e3a9b987a4e4c74b1f1af634b2f81ed P d6662a7688c84665b55623d3f97e4ae3: Bootstrap replayed 1/1 log segments. Stats: ops{read=112 overwritten=0 applied=111 ignored=0} inserts{seen=873 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20260502 14:06:13.798005 13818 tablet_bootstrap.cc:492] T 7e3a9b987a4e4c74b1f1af634b2f81ed P d6662a7688c84665b55623d3f97e4ae3: Bootstrap complete.
I20260502 14:06:13.799022 13818 ts_tablet_manager.cc:1403] T 7e3a9b987a4e4c74b1f1af634b2f81ed P d6662a7688c84665b55623d3f97e4ae3: Time spent bootstrapping tablet: real 0.030s user 0.017s sys 0.005s
I20260502 14:06:13.799167 13818 raft_consensus.cc:359] T 7e3a9b987a4e4c74b1f1af634b2f81ed P d6662a7688c84665b55623d3f97e4ae3 [term 1 FOLLOWER]: Replica starting. Triggering 1 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:13.799284 13818 raft_consensus.cc:740] T 7e3a9b987a4e4c74b1f1af634b2f81ed P d6662a7688c84665b55623d3f97e4ae3 [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: d6662a7688c84665b55623d3f97e4ae3, State: Initialized, Role: FOLLOWER
I20260502 14:06:13.799371 13818 consensus_queue.cc:260] T 7e3a9b987a4e4c74b1f1af634b2f81ed P d6662a7688c84665b55623d3f97e4ae3 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 111, Last appended: 1.112, Last appended by leader: 112, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:13.799485 13818 ts_tablet_manager.cc:1434] T 7e3a9b987a4e4c74b1f1af634b2f81ed P d6662a7688c84665b55623d3f97e4ae3: Time spent starting tablet: real 0.000s user 0.001s sys 0.000s
I20260502 14:06:13.799603 13818 tablet_bootstrap.cc:492] T bf9cb1779b0345fe868c2b551293512a P d6662a7688c84665b55623d3f97e4ae3: Bootstrap starting.
I20260502 14:06:13.822904 13818 tablet_bootstrap.cc:492] T bf9cb1779b0345fe868c2b551293512a P d6662a7688c84665b55623d3f97e4ae3: Bootstrap replayed 1/1 log segments. Stats: ops{read=111 overwritten=0 applied=111 ignored=0} inserts{seen=935 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20260502 14:06:13.823223 13818 tablet_bootstrap.cc:492] T bf9cb1779b0345fe868c2b551293512a P d6662a7688c84665b55623d3f97e4ae3: Bootstrap complete.
I20260502 14:06:13.825475 13818 ts_tablet_manager.cc:1403] T bf9cb1779b0345fe868c2b551293512a P d6662a7688c84665b55623d3f97e4ae3: Time spent bootstrapping tablet: real 0.026s user 0.012s sys 0.008s
I20260502 14:06:13.837740 13818 raft_consensus.cc:359] T bf9cb1779b0345fe868c2b551293512a P d6662a7688c84665b55623d3f97e4ae3 [term 1 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:13.837898 13818 raft_consensus.cc:740] T bf9cb1779b0345fe868c2b551293512a P d6662a7688c84665b55623d3f97e4ae3 [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: d6662a7688c84665b55623d3f97e4ae3, State: Initialized, Role: FOLLOWER
I20260502 14:06:13.837983 13818 consensus_queue.cc:260] T bf9cb1779b0345fe868c2b551293512a P d6662a7688c84665b55623d3f97e4ae3 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 111, Last appended: 1.111, Last appended by leader: 111, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:13.838122 13818 ts_tablet_manager.cc:1434] T bf9cb1779b0345fe868c2b551293512a P d6662a7688c84665b55623d3f97e4ae3: Time spent starting tablet: real 0.013s user 0.000s sys 0.000s
I20260502 14:06:13.838201 13818 tablet_bootstrap.cc:492] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3: Bootstrap starting.
W20260502 14:06:13.839994 13300 consensus_peers.cc:597] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: BOOTSTRAPPING. This is attempt 66: this message will repeat every 5th retry.
I20260502 14:06:13.864121 13871 raft_consensus.cc:3060] T 7e3a9b987a4e4c74b1f1af634b2f81ed P d6662a7688c84665b55623d3f97e4ae3 [term 1 FOLLOWER]: Advancing to term 2
I20260502 14:06:13.874053 13879 raft_consensus.cc:1217] T b175587f4e6a42a6b7e4740bae788705 P d6662a7688c84665b55623d3f97e4ae3 [term 1 FOLLOWER]: Deduplicated request from leader. Original: 1.111->[1.112-1.1679] Dedup: 1.112->[1.113-1.1679]
I20260502 14:06:13.912168 13818 tablet_bootstrap.cc:492] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3: Bootstrap replayed 1/1 log segments. Stats: ops{read=113 overwritten=0 applied=111 ignored=0} inserts{seen=946 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20260502 14:06:13.912667 13818 tablet_bootstrap.cc:492] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3: Bootstrap complete.
I20260502 14:06:13.913916 13818 ts_tablet_manager.cc:1403] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3: Time spent bootstrapping tablet: real 0.076s user 0.013s sys 0.011s
I20260502 14:06:13.914134 13818 raft_consensus.cc:359] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3 [term 1 FOLLOWER]: Replica starting. Triggering 2 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } }
I20260502 14:06:13.914428 13818 raft_consensus.cc:740] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3 [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: d6662a7688c84665b55623d3f97e4ae3, State: Initialized, Role: FOLLOWER
I20260502 14:06:13.914561 13818 consensus_queue.cc:260] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 111, Last appended: 1.113, Last appended by leader: 113, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } }
I20260502 14:06:13.914773 13818 ts_tablet_manager.cc:1434] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3: Time spent starting tablet: real 0.001s user 0.000s sys 0.000s
I20260502 14:06:13.914872 13818 tablet_bootstrap.cc:492] T be812ec5b6a24e06bd0e507df248605e P d6662a7688c84665b55623d3f97e4ae3: Bootstrap starting.
W20260502 14:06:13.915858 13169 consensus_peers.cc:597] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: BOOTSTRAPPING. This is attempt 66: this message will repeat every 5th retry.
W20260502 14:06:13.936378 13927 tablet_replica_mm_ops.cc:318] Log GC is disabled (check --enable_log_gc)
I20260502 14:06:13.961530 13875 raft_consensus.cc:3060] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3 [term 1 FOLLOWER]: Advancing to term 2
I20260502 14:06:13.965775 13875 pending_rounds.cc:85] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3: Aborting all ops after (but not including) 112
I20260502 14:06:13.967854 13875 pending_rounds.cc:107] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3: Aborting uncommitted WRITE_OP operation due to leader change: 1.113
I20260502 14:06:14.047928 13818 tablet_bootstrap.cc:492] T be812ec5b6a24e06bd0e507df248605e P d6662a7688c84665b55623d3f97e4ae3: Bootstrap replayed 1/1 log segments. Stats: ops{read=111 overwritten=0 applied=111 ignored=0} inserts{seen=899 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20260502 14:06:14.050894 13818 tablet_bootstrap.cc:492] T be812ec5b6a24e06bd0e507df248605e P d6662a7688c84665b55623d3f97e4ae3: Bootstrap complete.
I20260502 14:06:14.055987 13818 ts_tablet_manager.cc:1403] T be812ec5b6a24e06bd0e507df248605e P d6662a7688c84665b55623d3f97e4ae3: Time spent bootstrapping tablet: real 0.141s user 0.016s sys 0.008s
I20260502 14:06:14.056229 13818 raft_consensus.cc:359] T be812ec5b6a24e06bd0e507df248605e P d6662a7688c84665b55623d3f97e4ae3 [term 1 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:14.056375 13818 raft_consensus.cc:740] T be812ec5b6a24e06bd0e507df248605e P d6662a7688c84665b55623d3f97e4ae3 [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: d6662a7688c84665b55623d3f97e4ae3, State: Initialized, Role: FOLLOWER
I20260502 14:06:14.056493 13818 consensus_queue.cc:260] T be812ec5b6a24e06bd0e507df248605e P d6662a7688c84665b55623d3f97e4ae3 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 111, Last appended: 1.111, Last appended by leader: 111, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:14.056717 13818 ts_tablet_manager.cc:1434] T be812ec5b6a24e06bd0e507df248605e P d6662a7688c84665b55623d3f97e4ae3: Time spent starting tablet: real 0.001s user 0.000s sys 0.000s
I20260502 14:06:14.056807 13818 tablet_bootstrap.cc:492] T 1fd859f7e3144efcacca4175c53ef5df P d6662a7688c84665b55623d3f97e4ae3: Bootstrap starting.
I20260502 14:06:14.135221 13941 mvcc.cc:204] Tried to move back new op lower bound from 7281585236270379008 to 7281585223446380544. Current Snapshot: MvccSnapshot[applied={T|T < 7281585235633561600}]
I20260502 14:06:14.165052 13818 tablet_bootstrap.cc:492] T 1fd859f7e3144efcacca4175c53ef5df P d6662a7688c84665b55623d3f97e4ae3: Bootstrap replayed 1/1 log segments. Stats: ops{read=111 overwritten=0 applied=111 ignored=0} inserts{seen=955 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20260502 14:06:14.165373 13818 tablet_bootstrap.cc:492] T 1fd859f7e3144efcacca4175c53ef5df P d6662a7688c84665b55623d3f97e4ae3: Bootstrap complete.
I20260502 14:06:14.166215 13818 ts_tablet_manager.cc:1403] T 1fd859f7e3144efcacca4175c53ef5df P d6662a7688c84665b55623d3f97e4ae3: Time spent bootstrapping tablet: real 0.109s user 0.006s sys 0.012s
I20260502 14:06:14.166381 13818 raft_consensus.cc:359] T 1fd859f7e3144efcacca4175c53ef5df P d6662a7688c84665b55623d3f97e4ae3 [term 1 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:14.166482 13818 raft_consensus.cc:740] T 1fd859f7e3144efcacca4175c53ef5df P d6662a7688c84665b55623d3f97e4ae3 [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: d6662a7688c84665b55623d3f97e4ae3, State: Initialized, Role: FOLLOWER
I20260502 14:06:14.166545 13818 consensus_queue.cc:260] T 1fd859f7e3144efcacca4175c53ef5df P d6662a7688c84665b55623d3f97e4ae3 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 111, Last appended: 1.111, Last appended by leader: 111, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:14.166667 13818 ts_tablet_manager.cc:1434] T 1fd859f7e3144efcacca4175c53ef5df P d6662a7688c84665b55623d3f97e4ae3: Time spent starting tablet: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:14.269517 13935 raft_consensus.cc:493] T b175587f4e6a42a6b7e4740bae788705 P d6662a7688c84665b55623d3f97e4ae3 [term 1 FOLLOWER]: Starting pre-election (detected failure of leader 44217426b1314bc9824ba6f1661fb95b)
I20260502 14:06:14.269655 13935 raft_consensus.cc:515] T b175587f4e6a42a6b7e4740bae788705 P d6662a7688c84665b55623d3f97e4ae3 [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:14.270017 13935 leader_election.cc:290] T b175587f4e6a42a6b7e4740bae788705 P d6662a7688c84665b55623d3f97e4ae3 [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers 44217426b1314bc9824ba6f1661fb95b (127.12.158.66:33299), a1ddd8ec7c554a1eadeb2aa58f7ab430 (127.12.158.67:33939)
I20260502 14:06:14.289971 13231 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "b175587f4e6a42a6b7e4740bae788705" candidate_uuid: "d6662a7688c84665b55623d3f97e4ae3" candidate_term: 2 candidate_status { last_received { term: 1 index: 1679 } } ignore_live_leader: false dest_uuid: "44217426b1314bc9824ba6f1661fb95b" is_pre_election: true
I20260502 14:06:14.328564 13952 raft_consensus.cc:493] T 7e3a9b987a4e4c74b1f1af634b2f81ed P d6662a7688c84665b55623d3f97e4ae3 [term 2 FOLLOWER]: Starting pre-election (detected failure of leader a1ddd8ec7c554a1eadeb2aa58f7ab430)
I20260502 14:06:14.328663 13952 raft_consensus.cc:515] T 7e3a9b987a4e4c74b1f1af634b2f81ed P d6662a7688c84665b55623d3f97e4ae3 [term 2 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:14.328810 13952 leader_election.cc:290] T 7e3a9b987a4e4c74b1f1af634b2f81ed P d6662a7688c84665b55623d3f97e4ae3 [CANDIDATE]: Term 3 pre-election: Requested pre-vote from peers 44217426b1314bc9824ba6f1661fb95b (127.12.158.66:33299), a1ddd8ec7c554a1eadeb2aa58f7ab430 (127.12.158.67:33939)
I20260502 14:06:14.329448 13232 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "7e3a9b987a4e4c74b1f1af634b2f81ed" candidate_uuid: "d6662a7688c84665b55623d3f97e4ae3" candidate_term: 3 candidate_status { last_received { term: 2 index: 3202 } } ignore_live_leader: false dest_uuid: "44217426b1314bc9824ba6f1661fb95b" is_pre_election: true
I20260502 14:06:14.347131 13362 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "b175587f4e6a42a6b7e4740bae788705" candidate_uuid: "d6662a7688c84665b55623d3f97e4ae3" candidate_term: 2 candidate_status { last_received { term: 1 index: 1679 } } ignore_live_leader: false dest_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" is_pre_election: true
I20260502 14:06:14.347332 13362 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "7e3a9b987a4e4c74b1f1af634b2f81ed" candidate_uuid: "d6662a7688c84665b55623d3f97e4ae3" candidate_term: 3 candidate_status { last_received { term: 2 index: 3202 } } ignore_live_leader: false dest_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" is_pre_election: true
I20260502 14:06:14.347497 13812 leader_election.cc:304] T b175587f4e6a42a6b7e4740bae788705 P d6662a7688c84665b55623d3f97e4ae3 [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: d6662a7688c84665b55623d3f97e4ae3; no voters: 44217426b1314bc9824ba6f1661fb95b, a1ddd8ec7c554a1eadeb2aa58f7ab430
I20260502 14:06:14.347697 13812 leader_election.cc:304] T 7e3a9b987a4e4c74b1f1af634b2f81ed P d6662a7688c84665b55623d3f97e4ae3 [CANDIDATE]: Term 3 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: d6662a7688c84665b55623d3f97e4ae3; no voters: 44217426b1314bc9824ba6f1661fb95b, a1ddd8ec7c554a1eadeb2aa58f7ab430
I20260502 14:06:14.358450 13935 raft_consensus.cc:493] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3 [term 2 FOLLOWER]: Starting pre-election (detected failure of leader 44217426b1314bc9824ba6f1661fb95b)
I20260502 14:06:14.358520 13935 raft_consensus.cc:515] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3 [term 2 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } }
I20260502 14:06:14.358680 13935 leader_election.cc:290] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3 [CANDIDATE]: Term 3 pre-election: Requested pre-vote from peers a1ddd8ec7c554a1eadeb2aa58f7ab430 (127.12.158.67:33939), 44217426b1314bc9824ba6f1661fb95b (127.12.158.66:33299)
I20260502 14:06:14.358734 13935 raft_consensus.cc:2749] T 7e3a9b987a4e4c74b1f1af634b2f81ed P d6662a7688c84665b55623d3f97e4ae3 [term 2 FOLLOWER]: Leader pre-election lost for term 3. Reason: could not achieve majority
I20260502 14:06:14.359156 13231 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d4c02bd85161406f918b1aee009c7294" candidate_uuid: "d6662a7688c84665b55623d3f97e4ae3" candidate_term: 3 candidate_status { last_received { term: 2 index: 1672 } } ignore_live_leader: false dest_uuid: "44217426b1314bc9824ba6f1661fb95b" is_pre_election: true
I20260502 14:06:14.359732 13362 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d4c02bd85161406f918b1aee009c7294" candidate_uuid: "d6662a7688c84665b55623d3f97e4ae3" candidate_term: 3 candidate_status { last_received { term: 2 index: 1672 } } ignore_live_leader: false dest_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" is_pre_election: true
I20260502 14:06:14.360175 13812 leader_election.cc:304] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3 [CANDIDATE]: Term 3 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: d6662a7688c84665b55623d3f97e4ae3; no voters: 44217426b1314bc9824ba6f1661fb95b, a1ddd8ec7c554a1eadeb2aa58f7ab430
I20260502 14:06:14.364765 13959 raft_consensus.cc:2749] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3 [term 2 FOLLOWER]: Leader pre-election lost for term 3. Reason: could not achieve majority
I20260502 14:06:14.390884 13935 raft_consensus.cc:493] T bf9cb1779b0345fe868c2b551293512a P d6662a7688c84665b55623d3f97e4ae3 [term 1 FOLLOWER]: Starting pre-election (detected failure of leader a1ddd8ec7c554a1eadeb2aa58f7ab430)
I20260502 14:06:14.390959 13935 raft_consensus.cc:515] T bf9cb1779b0345fe868c2b551293512a P d6662a7688c84665b55623d3f97e4ae3 [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:14.391134 13935 leader_election.cc:290] T bf9cb1779b0345fe868c2b551293512a P d6662a7688c84665b55623d3f97e4ae3 [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers 44217426b1314bc9824ba6f1661fb95b (127.12.158.66:33299), a1ddd8ec7c554a1eadeb2aa58f7ab430 (127.12.158.67:33939)
I20260502 14:06:14.391594 13362 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "bf9cb1779b0345fe868c2b551293512a" candidate_uuid: "d6662a7688c84665b55623d3f97e4ae3" candidate_term: 2 candidate_status { last_received { term: 1 index: 1675 } } ignore_live_leader: false dest_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" is_pre_election: true
I20260502 14:06:14.391785 13232 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "bf9cb1779b0345fe868c2b551293512a" candidate_uuid: "d6662a7688c84665b55623d3f97e4ae3" candidate_term: 2 candidate_status { last_received { term: 1 index: 1675 } } ignore_live_leader: false dest_uuid: "44217426b1314bc9824ba6f1661fb95b" is_pre_election: true
I20260502 14:06:14.392081 13812 leader_election.cc:304] T bf9cb1779b0345fe868c2b551293512a P d6662a7688c84665b55623d3f97e4ae3 [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: d6662a7688c84665b55623d3f97e4ae3; no voters: 44217426b1314bc9824ba6f1661fb95b, a1ddd8ec7c554a1eadeb2aa58f7ab430
I20260502 14:06:14.392974 13935 raft_consensus.cc:2749] T bf9cb1779b0345fe868c2b551293512a P d6662a7688c84665b55623d3f97e4ae3 [term 1 FOLLOWER]: Leader pre-election lost for term 2. Reason: could not achieve majority
I20260502 14:06:14.420626 13952 raft_consensus.cc:2749] T b175587f4e6a42a6b7e4740bae788705 P d6662a7688c84665b55623d3f97e4ae3 [term 1 FOLLOWER]: Leader pre-election lost for term 2. Reason: could not achieve majority
I20260502 14:06:14.459861 13952 raft_consensus.cc:493] T be812ec5b6a24e06bd0e507df248605e P d6662a7688c84665b55623d3f97e4ae3 [term 1 FOLLOWER]: Starting pre-election (detected failure of leader 44217426b1314bc9824ba6f1661fb95b)
I20260502 14:06:14.459941 13952 raft_consensus.cc:515] T be812ec5b6a24e06bd0e507df248605e P d6662a7688c84665b55623d3f97e4ae3 [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:14.460120 13952 leader_election.cc:290] T be812ec5b6a24e06bd0e507df248605e P d6662a7688c84665b55623d3f97e4ae3 [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers 44217426b1314bc9824ba6f1661fb95b (127.12.158.66:33299), a1ddd8ec7c554a1eadeb2aa58f7ab430 (127.12.158.67:33939)
I20260502 14:06:14.460420 13360 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "be812ec5b6a24e06bd0e507df248605e" candidate_uuid: "d6662a7688c84665b55623d3f97e4ae3" candidate_term: 2 candidate_status { last_received { term: 1 index: 1673 } } ignore_live_leader: false dest_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" is_pre_election: true
I20260502 14:06:14.460466 13232 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "be812ec5b6a24e06bd0e507df248605e" candidate_uuid: "d6662a7688c84665b55623d3f97e4ae3" candidate_term: 2 candidate_status { last_received { term: 1 index: 1673 } } ignore_live_leader: false dest_uuid: "44217426b1314bc9824ba6f1661fb95b" is_pre_election: true
I20260502 14:06:14.460883 13812 leader_election.cc:304] T be812ec5b6a24e06bd0e507df248605e P d6662a7688c84665b55623d3f97e4ae3 [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: d6662a7688c84665b55623d3f97e4ae3; no voters: 44217426b1314bc9824ba6f1661fb95b, a1ddd8ec7c554a1eadeb2aa58f7ab430
I20260502 14:06:14.460999 13952 raft_consensus.cc:2749] T be812ec5b6a24e06bd0e507df248605e P d6662a7688c84665b55623d3f97e4ae3 [term 1 FOLLOWER]: Leader pre-election lost for term 2. Reason: could not achieve majority
I20260502 14:06:14.549921 13952 raft_consensus.cc:493] T 1fd859f7e3144efcacca4175c53ef5df P d6662a7688c84665b55623d3f97e4ae3 [term 1 FOLLOWER]: Starting pre-election (detected failure of leader 44217426b1314bc9824ba6f1661fb95b)
I20260502 14:06:14.550002 13952 raft_consensus.cc:515] T 1fd859f7e3144efcacca4175c53ef5df P d6662a7688c84665b55623d3f97e4ae3 [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } }
I20260502 14:06:14.550129 13952 leader_election.cc:290] T 1fd859f7e3144efcacca4175c53ef5df P d6662a7688c84665b55623d3f97e4ae3 [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers 44217426b1314bc9824ba6f1661fb95b (127.12.158.66:33299), a1ddd8ec7c554a1eadeb2aa58f7ab430 (127.12.158.67:33939)
I20260502 14:06:14.550606 13360 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "1fd859f7e3144efcacca4175c53ef5df" candidate_uuid: "d6662a7688c84665b55623d3f97e4ae3" candidate_term: 2 candidate_status { last_received { term: 1 index: 1681 } } ignore_live_leader: false dest_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" is_pre_election: true
I20260502 14:06:14.552783 13232 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "1fd859f7e3144efcacca4175c53ef5df" candidate_uuid: "d6662a7688c84665b55623d3f97e4ae3" candidate_term: 2 candidate_status { last_received { term: 1 index: 1681 } } ignore_live_leader: false dest_uuid: "44217426b1314bc9824ba6f1661fb95b" is_pre_election: true
I20260502 14:06:14.555814 13814 leader_election.cc:304] T 1fd859f7e3144efcacca4175c53ef5df P d6662a7688c84665b55623d3f97e4ae3 [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: d6662a7688c84665b55623d3f97e4ae3; no voters: 44217426b1314bc9824ba6f1661fb95b, a1ddd8ec7c554a1eadeb2aa58f7ab430
I20260502 14:06:14.828655 13952 raft_consensus.cc:2749] T 1fd859f7e3144efcacca4175c53ef5df P d6662a7688c84665b55623d3f97e4ae3 [term 1 FOLLOWER]: Leader pre-election lost for term 2. Reason: could not achieve majority
I20260502 14:06:14.835983 13935 raft_consensus.cc:493] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3 [term 2 FOLLOWER]: Starting pre-election (detected failure of leader 44217426b1314bc9824ba6f1661fb95b)
I20260502 14:06:14.836057 13935 raft_consensus.cc:515] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3 [term 2 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } }
I20260502 14:06:14.836247 13935 leader_election.cc:290] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3 [CANDIDATE]: Term 3 pre-election: Requested pre-vote from peers a1ddd8ec7c554a1eadeb2aa58f7ab430 (127.12.158.67:33939), 44217426b1314bc9824ba6f1661fb95b (127.12.158.66:33299)
I20260502 14:06:14.836547 13363 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d4c02bd85161406f918b1aee009c7294" candidate_uuid: "d6662a7688c84665b55623d3f97e4ae3" candidate_term: 3 candidate_status { last_received { term: 2 index: 3220 } } ignore_live_leader: false dest_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" is_pre_election: true
I20260502 14:06:14.837523 13232 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d4c02bd85161406f918b1aee009c7294" candidate_uuid: "d6662a7688c84665b55623d3f97e4ae3" candidate_term: 3 candidate_status { last_received { term: 2 index: 3220 } } ignore_live_leader: false dest_uuid: "44217426b1314bc9824ba6f1661fb95b" is_pre_election: true
I20260502 14:06:14.837770 13814 leader_election.cc:304] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3 [CANDIDATE]: Term 3 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: d6662a7688c84665b55623d3f97e4ae3; no voters: 44217426b1314bc9824ba6f1661fb95b, a1ddd8ec7c554a1eadeb2aa58f7ab430
I20260502 14:06:14.837865 13935 raft_consensus.cc:2749] T d4c02bd85161406f918b1aee009c7294 P d6662a7688c84665b55623d3f97e4ae3 [term 2 FOLLOWER]: Leader pre-election lost for term 3. Reason: could not achieve majority
I20260502 14:06:15.021209 13727 ts_manager.cc:284] Unset tserver state for d6662a7688c84665b55623d3f97e4ae3 from MAINTENANCE_MODE
I20260502 14:06:15.048372 13674 heartbeater.cc:507] Master 127.12.158.126:43627 requested a full tablet report, sending...
I20260502 14:06:15.408345 13410 heartbeater.cc:507] Master 127.12.158.126:43627 requested a full tablet report, sending...
I20260502 14:06:15.553491 13279 heartbeater.cc:507] Master 127.12.158.126:43627 requested a full tablet report, sending...
I20260502 14:06:15.838101 13926 heartbeater.cc:507] Master 127.12.158.126:43627 requested a full tablet report, sending...
I20260502 14:06:18.060256 13727 ts_manager.cc:295] Set tserver state for d6662a7688c84665b55623d3f97e4ae3 to MAINTENANCE_MODE
I20260502 14:06:18.060695 12921 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskW6FTeC/build/release/bin/kudu with pid 13797
W20260502 14:06:18.088371 13169 connection.cc:570] client connection to 127.12.158.65:44257 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20260502 14:06:18.088464 13169 proxy.cc:239] Call had error, refreshing address and retrying: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107) [suppressed 57 similar messages]
W20260502 14:06:18.089270 13300 connection.cc:570] client connection to 127.12.158.65:44257 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20260502 14:06:18.089803 13300 proxy.cc:239] Call had error, refreshing address and retrying: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107) [suppressed 29 similar messages]
W20260502 14:06:18.089921 13169 consensus_peers.cc:597] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20260502 14:06:18.089975 13169 consensus_peers.cc:597] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20260502 14:06:18.090003 13169 consensus_peers.cc:597] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20260502 14:06:18.090025 13169 consensus_peers.cc:597] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20260502 14:06:18.090240 13300 consensus_peers.cc:597] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20260502 14:06:18.090461 13300 consensus_peers.cc:597] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20260502 14:06:18.520285 13300 consensus_peers.cc:597] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20260502 14:06:18.557874 13169 consensus_peers.cc:597] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20260502 14:06:18.577728 13169 consensus_peers.cc:597] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20260502 14:06:18.582362 13169 consensus_peers.cc:597] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20260502 14:06:18.588486 13300 consensus_peers.cc:597] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20260502 14:06:18.602545 13169 consensus_peers.cc:597] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20260502 14:06:19.035537 13300 consensus_peers.cc:597] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20260502 14:06:19.050742 13169 consensus_peers.cc:597] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20260502 14:06:19.084393 13169 consensus_peers.cc:597] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20260502 14:06:19.088836 13169 consensus_peers.cc:597] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20260502 14:06:19.104538 13169 consensus_peers.cc:597] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20260502 14:06:19.120918 13300 consensus_peers.cc:597] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20260502 14:06:19.545810 13300 consensus_peers.cc:597] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20260502 14:06:19.557157 13169 consensus_peers.cc:597] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20260502 14:06:19.570374 13169 consensus_peers.cc:597] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20260502 14:06:19.572504 13169 consensus_peers.cc:597] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20260502 14:06:19.612071 13169 consensus_peers.cc:597] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20260502 14:06:19.617776 13300 consensus_peers.cc:597] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20260502 14:06:20.020269 13169 consensus_peers.cc:597] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20260502 14:06:20.071961 13974 consensus_queue.cc:579] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 [LEADER]: Leader has been unable to successfully communicate with peer d6662a7688c84665b55623d3f97e4ae3 for more than 2 seconds (2.014s)
I20260502 14:06:20.083956 13981 consensus_queue.cc:579] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b [LEADER]: Leader has been unable to successfully communicate with peer d6662a7688c84665b55623d3f97e4ae3 for more than 2 seconds (2.025s)
W20260502 14:06:20.084054 13300 consensus_peers.cc:597] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20260502 14:06:20.087265 13791 consensus_queue.cc:579] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b [LEADER]: Leader has been unable to successfully communicate with peer d6662a7688c84665b55623d3f97e4ae3 for more than 2 seconds (2.027s)
W20260502 14:06:20.089774 13169 consensus_peers.cc:597] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
W20260502 14:06:20.094275 13169 consensus_peers.cc:597] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20260502 14:06:20.109805 13981 consensus_queue.cc:579] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b [LEADER]: Leader has been unable to successfully communicate with peer d6662a7688c84665b55623d3f97e4ae3 for more than 2 seconds (2.051s)
I20260502 14:06:20.123592 13994 consensus_queue.cc:579] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b [LEADER]: Leader has been unable to successfully communicate with peer d6662a7688c84665b55623d3f97e4ae3 for more than 2 seconds (2.063s)
W20260502 14:06:20.126940 13169 consensus_peers.cc:597] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20260502 14:06:20.159466 13968 consensus_queue.cc:579] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 [LEADER]: Leader has been unable to successfully communicate with peer d6662a7688c84665b55623d3f97e4ae3 for more than 2 seconds (2.101s)
W20260502 14:06:20.162570 13300 consensus_peers.cc:597] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20260502 14:06:20.266829 13720 ts_manager.cc:284] Unset tserver state for d6662a7688c84665b55623d3f97e4ae3 from MAINTENANCE_MODE
W20260502 14:06:20.521101 13169 consensus_peers.cc:597] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20260502 14:06:20.584051 13300 consensus_peers.cc:597] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20260502 14:06:20.597826 13169 consensus_peers.cc:597] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20260502 14:06:20.600111 13169 consensus_peers.cc:597] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20260502 14:06:20.628583 13169 consensus_peers.cc:597] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20260502 14:06:20.636963 13300 consensus_peers.cc:597] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20260502 14:06:21.027334 13169 consensus_peers.cc:597] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
W20260502 14:06:21.059002 13169 consensus_peers.cc:597] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
I20260502 14:06:21.062722 13674 heartbeater.cc:507] Master 127.12.158.126:43627 requested a full tablet report, sending...
W20260502 14:06:21.063203 13300 consensus_peers.cc:597] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
I20260502 14:06:21.110031 13514 consensus_queue.cc:799] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b [LEADER]: Peer d6662a7688c84665b55623d3f97e4ae3 is lagging by at least 2 ops behind the committed index [suppressed 6 similar messages]
W20260502 14:06:21.115813 13169 consensus_peers.cc:597] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
I20260502 14:06:21.126645 13279 heartbeater.cc:507] Master 127.12.158.126:43627 requested a full tablet report, sending...
I20260502 14:06:21.137276 13681 consensus_queue.cc:799] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 [LEADER]: Peer d6662a7688c84665b55623d3f97e4ae3 is lagging by at least 21 ops behind the committed index [suppressed 4 similar messages]
I20260502 14:06:21.140430 14004 consensus_queue.cc:799] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b [LEADER]: Peer d6662a7688c84665b55623d3f97e4ae3 is lagging by at least 16 ops behind the committed index [suppressed 4 similar messages]
W20260502 14:06:21.140789 13300 consensus_peers.cc:597] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
I20260502 14:06:21.146021 13233 logging.cc:424] LogThrottler TrackedPeer Lag: suppressed but not reported on 5 messages since previous log ~7 seconds ago
I20260502 14:06:21.146165 13233 consensus_queue.cc:237] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 6193, Committed index: 6193, Last appended: 1.6193, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 6194 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } } peers { permanent_uuid: "3d450e4b5d234cc08a659a2375cf9753" member_type: NON_VOTER last_known_addr { host: "127.12.158.68" port: 35609 } attrs { promote: true } }
I20260502 14:06:21.146153 13230 consensus_queue.cc:237] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 6189, Committed index: 6189, Last appended: 2.6193, Last appended by leader: 112, Current term: 2, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 6194 OBSOLETE_local: false peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "3d450e4b5d234cc08a659a2375cf9753" member_type: NON_VOTER last_known_addr { host: "127.12.158.68" port: 35609 } attrs { promote: true } }
I20260502 14:06:21.146153 13232 consensus_queue.cc:237] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 6192, Committed index: 6192, Last appended: 1.6192, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 6193 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } } peers { permanent_uuid: "3d450e4b5d234cc08a659a2375cf9753" member_type: NON_VOTER last_known_addr { host: "127.12.158.68" port: 35609 } attrs { promote: true } }
I20260502 14:06:21.146960 13363 raft_consensus.cc:1275] T d4c02bd85161406f918b1aee009c7294 P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 2 FOLLOWER]: Refusing update from remote peer 44217426b1314bc9824ba6f1661fb95b: Log matching property violated. Preceding OpId in replica: term: 2 index: 6189. Preceding OpId from leader: term: 2 index: 6194. (index mismatch)
I20260502 14:06:21.147220 13363 raft_consensus.cc:1275] T be812ec5b6a24e06bd0e507df248605e P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 1 FOLLOWER]: Refusing update from remote peer 44217426b1314bc9824ba6f1661fb95b: Log matching property violated. Preceding OpId in replica: term: 1 index: 6193. Preceding OpId from leader: term: 1 index: 6194. (index mismatch)
W20260502 14:06:21.147615 13169 consensus_peers.cc:597] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20260502 14:06:21.147658 13169 consensus_peers.cc:597] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20260502 14:06:21.148350 13169 consensus_peers.cc:597] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
I20260502 14:06:21.148826 14004 consensus_queue.cc:799] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b [LEADER]: Peer d6662a7688c84665b55623d3f97e4ae3 is lagging by at least 25 ops behind the committed index [suppressed 2 similar messages]
I20260502 14:06:21.149219 13231 consensus_queue.cc:237] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 6192, Committed index: 6192, Last appended: 1.6192, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 6193 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } } peers { permanent_uuid: "3d450e4b5d234cc08a659a2375cf9753" member_type: NON_VOTER last_known_addr { host: "127.12.158.68" port: 35609 } attrs { promote: true } }
W20260502 14:06:21.149940 13169 consensus_peers.cc:597] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
I20260502 14:06:21.151616 14004 consensus_queue.cc:1048] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b [LEADER]: Connected to new peer: Peer: permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 6194, Last known committed idx: 6189, Time since last communication: 0.000s
I20260502 14:06:21.151649 13994 consensus_queue.cc:1048] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b [LEADER]: Connected to new peer: Peer: permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 6194, Last known committed idx: 6193, Time since last communication: 0.000s
I20260502 14:06:21.151921 13363 raft_consensus.cc:1275] T 1fd859f7e3144efcacca4175c53ef5df P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 1 FOLLOWER]: Refusing update from remote peer 44217426b1314bc9824ba6f1661fb95b: Log matching property violated. Preceding OpId in replica: term: 1 index: 6192. Preceding OpId from leader: term: 1 index: 6193. (index mismatch)
I20260502 14:06:21.152123 13363 raft_consensus.cc:1275] T b175587f4e6a42a6b7e4740bae788705 P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 1 FOLLOWER]: Refusing update from remote peer 44217426b1314bc9824ba6f1661fb95b: Log matching property violated. Preceding OpId in replica: term: 1 index: 6192. Preceding OpId from leader: term: 1 index: 6193. (index mismatch)
W20260502 14:06:21.152136 13166 consensus_peers.cc:597] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b -> Peer 3d450e4b5d234cc08a659a2375cf9753 (127.12.158.68:35609): Couldn't send request to peer 3d450e4b5d234cc08a659a2375cf9753. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: be812ec5b6a24e06bd0e507df248605e. This is attempt 1: this message will repeat every 5th retry.
W20260502 14:06:21.152180 13166 consensus_peers.cc:597] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b -> Peer 3d450e4b5d234cc08a659a2375cf9753 (127.12.158.68:35609): Couldn't send request to peer 3d450e4b5d234cc08a659a2375cf9753. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: d4c02bd85161406f918b1aee009c7294. This is attempt 1: this message will repeat every 5th retry.
W20260502 14:06:21.152213 13166 consensus_peers.cc:597] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b -> Peer 3d450e4b5d234cc08a659a2375cf9753 (127.12.158.68:35609): Couldn't send request to peer 3d450e4b5d234cc08a659a2375cf9753. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: 1fd859f7e3144efcacca4175c53ef5df. This is attempt 1: this message will repeat every 5th retry.
W20260502 14:06:21.152278 13166 consensus_peers.cc:597] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b -> Peer 3d450e4b5d234cc08a659a2375cf9753 (127.12.158.68:35609): Couldn't send request to peer 3d450e4b5d234cc08a659a2375cf9753. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: b175587f4e6a42a6b7e4740bae788705. This is attempt 1: this message will repeat every 5th retry.
I20260502 14:06:21.152415 14004 consensus_queue.cc:1048] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b [LEADER]: Connected to new peer: Peer: permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 6193, Last known committed idx: 6192, Time since last communication: 0.000s
I20260502 14:06:21.152436 13994 consensus_queue.cc:1048] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b [LEADER]: Connected to new peer: Peer: permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 6193, Last known committed idx: 6192, Time since last communication: 0.000s
I20260502 14:06:21.152871 14004 raft_consensus.cc:2955] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b [term 1 LEADER]: Committing config change with OpId 1.6194: config changed from index -1 to 6194, NON_VOTER 3d450e4b5d234cc08a659a2375cf9753 (127.12.158.68) added. New config: { opid_index: 6194 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } } peers { permanent_uuid: "3d450e4b5d234cc08a659a2375cf9753" member_type: NON_VOTER last_known_addr { host: "127.12.158.68" port: 35609 } attrs { promote: true } } }
I20260502 14:06:21.153810 13715 catalog_manager.cc:5184] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet be812ec5b6a24e06bd0e507df248605e with cas_config_opid_index -1: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
I20260502 14:06:21.154013 14003 raft_consensus.cc:2955] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b [term 1 LEADER]: Committing config change with OpId 1.6193: config changed from index -1 to 6193, NON_VOTER 3d450e4b5d234cc08a659a2375cf9753 (127.12.158.68) added. New config: { opid_index: 6193 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } } peers { permanent_uuid: "3d450e4b5d234cc08a659a2375cf9753" member_type: NON_VOTER last_known_addr { host: "127.12.158.68" port: 35609 } attrs { promote: true } } }
I20260502 14:06:21.154040 13791 raft_consensus.cc:2955] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b [term 2 LEADER]: Committing config change with OpId 2.6194: config changed from index -1 to 6194, NON_VOTER 3d450e4b5d234cc08a659a2375cf9753 (127.12.158.68) added. New config: { opid_index: 6194 OBSOLETE_local: false peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "3d450e4b5d234cc08a659a2375cf9753" member_type: NON_VOTER last_known_addr { host: "127.12.158.68" port: 35609 } attrs { promote: true } } }
I20260502 14:06:21.154064 13720 catalog_manager.cc:5671] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b reported cstate change: config changed from index -1 to 6194, NON_VOTER 3d450e4b5d234cc08a659a2375cf9753 (127.12.158.68) added. New cstate: current_term: 1 leader_uuid: "44217426b1314bc9824ba6f1661fb95b" committed_config { opid_index: 6194 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "3d450e4b5d234cc08a659a2375cf9753" member_type: NON_VOTER last_known_addr { host: "127.12.158.68" port: 35609 } attrs { promote: true } health_report { overall_health: UNKNOWN } } }
I20260502 14:06:21.154145 13360 raft_consensus.cc:2955] T d4c02bd85161406f918b1aee009c7294 P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 2 FOLLOWER]: Committing config change with OpId 2.6194: config changed from index -1 to 6194, NON_VOTER 3d450e4b5d234cc08a659a2375cf9753 (127.12.158.68) added. New config: { opid_index: 6194 OBSOLETE_local: false peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "3d450e4b5d234cc08a659a2375cf9753" member_type: NON_VOTER last_known_addr { host: "127.12.158.68" port: 35609 } attrs { promote: true } } }
I20260502 14:06:21.155143 13363 raft_consensus.cc:2955] T be812ec5b6a24e06bd0e507df248605e P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 1 FOLLOWER]: Committing config change with OpId 1.6194: config changed from index -1 to 6194, NON_VOTER 3d450e4b5d234cc08a659a2375cf9753 (127.12.158.68) added. New config: { opid_index: 6194 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } } peers { permanent_uuid: "3d450e4b5d234cc08a659a2375cf9753" member_type: NON_VOTER last_known_addr { host: "127.12.158.68" port: 35609 } attrs { promote: true } } }
I20260502 14:06:21.155364 13360 raft_consensus.cc:2955] T 1fd859f7e3144efcacca4175c53ef5df P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 1 FOLLOWER]: Committing config change with OpId 1.6193: config changed from index -1 to 6193, NON_VOTER 3d450e4b5d234cc08a659a2375cf9753 (127.12.158.68) added. New config: { opid_index: 6193 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } } peers { permanent_uuid: "3d450e4b5d234cc08a659a2375cf9753" member_type: NON_VOTER last_known_addr { host: "127.12.158.68" port: 35609 } attrs { promote: true } } }
I20260502 14:06:21.155557 13715 catalog_manager.cc:5184] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet 1fd859f7e3144efcacca4175c53ef5df with cas_config_opid_index -1: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
I20260502 14:06:21.155541 13727 catalog_manager.cc:5671] T d4c02bd85161406f918b1aee009c7294 P a1ddd8ec7c554a1eadeb2aa58f7ab430 reported cstate change: config changed from index -1 to 6194, NON_VOTER 3d450e4b5d234cc08a659a2375cf9753 (127.12.158.68) added. New cstate: current_term: 2 leader_uuid: "44217426b1314bc9824ba6f1661fb95b" committed_config { opid_index: 6194 OBSOLETE_local: false peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "3d450e4b5d234cc08a659a2375cf9753" member_type: NON_VOTER last_known_addr { host: "127.12.158.68" port: 35609 } attrs { promote: true } } }
I20260502 14:06:21.155648 13715 catalog_manager.cc:5184] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet d4c02bd85161406f918b1aee009c7294 with cas_config_opid_index -1: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
I20260502 14:06:21.155788 13978 raft_consensus.cc:2955] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b [term 1 LEADER]: Committing config change with OpId 1.6193: config changed from index -1 to 6193, NON_VOTER 3d450e4b5d234cc08a659a2375cf9753 (127.12.158.68) added. New config: { opid_index: 6193 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } } peers { permanent_uuid: "3d450e4b5d234cc08a659a2375cf9753" member_type: NON_VOTER last_known_addr { host: "127.12.158.68" port: 35609 } attrs { promote: true } } }
I20260502 14:06:21.156714 13715 catalog_manager.cc:5184] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet b175587f4e6a42a6b7e4740bae788705 with cas_config_opid_index -1: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
I20260502 14:06:21.156977 13360 raft_consensus.cc:2955] T b175587f4e6a42a6b7e4740bae788705 P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 1 FOLLOWER]: Committing config change with OpId 1.6193: config changed from index -1 to 6193, NON_VOTER 3d450e4b5d234cc08a659a2375cf9753 (127.12.158.68) added. New config: { opid_index: 6193 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } } peers { permanent_uuid: "3d450e4b5d234cc08a659a2375cf9753" member_type: NON_VOTER last_known_addr { host: "127.12.158.68" port: 35609 } attrs { promote: true } } }
I20260502 14:06:21.161478 13410 heartbeater.cc:507] Master 127.12.158.126:43627 requested a full tablet report, sending...
I20260502 14:06:21.161499 13727 catalog_manager.cc:5671] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b reported cstate change: config changed from index -1 to 6193, NON_VOTER 3d450e4b5d234cc08a659a2375cf9753 (127.12.158.68) added. New cstate: current_term: 1 leader_uuid: "44217426b1314bc9824ba6f1661fb95b" committed_config { opid_index: 6193 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "3d450e4b5d234cc08a659a2375cf9753" member_type: NON_VOTER last_known_addr { host: "127.12.158.68" port: 35609 } attrs { promote: true } health_report { overall_health: UNKNOWN } } }
I20260502 14:06:21.161589 13727 catalog_manager.cc:5671] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b reported cstate change: config changed from index -1 to 6193, NON_VOTER 3d450e4b5d234cc08a659a2375cf9753 (127.12.158.68) added. New cstate: current_term: 1 leader_uuid: "44217426b1314bc9824ba6f1661fb95b" committed_config { opid_index: 6193 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "3d450e4b5d234cc08a659a2375cf9753" member_type: NON_VOTER last_known_addr { host: "127.12.158.68" port: 35609 } attrs { promote: true } health_report { overall_health: UNKNOWN } } }
I20260502 14:06:21.165499 13980 consensus_queue.cc:799] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 [LEADER]: Peer d6662a7688c84665b55623d3f97e4ae3 is lagging by at least 27 ops behind the committed index [suppressed 1 similar messages]
I20260502 14:06:21.167719 13361 consensus_queue.cc:237] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 6195, Committed index: 6195, Last appended: 1.6196, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 6197 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } } peers { permanent_uuid: "3d450e4b5d234cc08a659a2375cf9753" member_type: NON_VOTER last_known_addr { host: "127.12.158.68" port: 35609 } attrs { promote: true } }
I20260502 14:06:21.167995 13362 consensus_queue.cc:237] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 6197, Committed index: 6197, Last appended: 2.6198, Last appended by leader: 112, Current term: 2, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 6199 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } } peers { permanent_uuid: "3d450e4b5d234cc08a659a2375cf9753" member_type: NON_VOTER last_known_addr { host: "127.12.158.68" port: 35609 } attrs { promote: true } }
W20260502 14:06:21.168706 13300 consensus_peers.cc:597] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20260502 14:06:21.168756 13300 consensus_peers.cc:597] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257): Couldn't send request to peer d6662a7688c84665b55623d3f97e4ae3. Status: Network error: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
I20260502 14:06:21.170148 13232 raft_consensus.cc:1275] T 7e3a9b987a4e4c74b1f1af634b2f81ed P 44217426b1314bc9824ba6f1661fb95b [term 2 FOLLOWER]: Refusing update from remote peer a1ddd8ec7c554a1eadeb2aa58f7ab430: Log matching property violated. Preceding OpId in replica: term: 2 index: 6198. Preceding OpId from leader: term: 2 index: 6199. (index mismatch)
I20260502 14:06:21.170233 13231 raft_consensus.cc:1275] T bf9cb1779b0345fe868c2b551293512a P 44217426b1314bc9824ba6f1661fb95b [term 1 FOLLOWER]: Refusing update from remote peer a1ddd8ec7c554a1eadeb2aa58f7ab430: Log matching property violated. Preceding OpId in replica: term: 1 index: 6196. Preceding OpId from leader: term: 1 index: 6197. (index mismatch)
I20260502 14:06:21.170374 13997 consensus_queue.cc:1048] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 [LEADER]: Connected to new peer: Peer: permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 6199, Last known committed idx: 6197, Time since last communication: 0.000s
I20260502 14:06:21.170392 13996 consensus_queue.cc:1048] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 [LEADER]: Connected to new peer: Peer: permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 6197, Last known committed idx: 6195, Time since last communication: 0.000s
I20260502 14:06:21.171096 13681 raft_consensus.cc:2955] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 2 LEADER]: Committing config change with OpId 2.6199: config changed from index -1 to 6199, NON_VOTER 3d450e4b5d234cc08a659a2375cf9753 (127.12.158.68) added. New config: { opid_index: 6199 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } } peers { permanent_uuid: "3d450e4b5d234cc08a659a2375cf9753" member_type: NON_VOTER last_known_addr { host: "127.12.158.68" port: 35609 } attrs { promote: true } } }
I20260502 14:06:21.171270 13996 raft_consensus.cc:2955] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 [term 1 LEADER]: Committing config change with OpId 1.6197: config changed from index -1 to 6197, NON_VOTER 3d450e4b5d234cc08a659a2375cf9753 (127.12.158.68) added. New config: { opid_index: 6197 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } } peers { permanent_uuid: "3d450e4b5d234cc08a659a2375cf9753" member_type: NON_VOTER last_known_addr { host: "127.12.158.68" port: 35609 } attrs { promote: true } } }
I20260502 14:06:21.172186 13232 raft_consensus.cc:2955] T 7e3a9b987a4e4c74b1f1af634b2f81ed P 44217426b1314bc9824ba6f1661fb95b [term 2 FOLLOWER]: Committing config change with OpId 2.6199: config changed from index -1 to 6199, NON_VOTER 3d450e4b5d234cc08a659a2375cf9753 (127.12.158.68) added. New config: { opid_index: 6199 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } } peers { permanent_uuid: "3d450e4b5d234cc08a659a2375cf9753" member_type: NON_VOTER last_known_addr { host: "127.12.158.68" port: 35609 } attrs { promote: true } } }
I20260502 14:06:21.172278 13231 raft_consensus.cc:2955] T bf9cb1779b0345fe868c2b551293512a P 44217426b1314bc9824ba6f1661fb95b [term 1 FOLLOWER]: Committing config change with OpId 1.6197: config changed from index -1 to 6197, NON_VOTER 3d450e4b5d234cc08a659a2375cf9753 (127.12.158.68) added. New config: { opid_index: 6197 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } } peers { permanent_uuid: "3d450e4b5d234cc08a659a2375cf9753" member_type: NON_VOTER last_known_addr { host: "127.12.158.68" port: 35609 } attrs { promote: true } } }
I20260502 14:06:21.172348 13713 catalog_manager.cc:5184] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet 7e3a9b987a4e4c74b1f1af634b2f81ed with cas_config_opid_index -1: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
I20260502 14:06:21.172484 13713 catalog_manager.cc:5184] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet bf9cb1779b0345fe868c2b551293512a with cas_config_opid_index -1: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
I20260502 14:06:21.173496 13720 catalog_manager.cc:5671] T bf9cb1779b0345fe868c2b551293512a P 44217426b1314bc9824ba6f1661fb95b reported cstate change: config changed from index -1 to 6197, NON_VOTER 3d450e4b5d234cc08a659a2375cf9753 (127.12.158.68) added. New cstate: current_term: 1 leader_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" committed_config { opid_index: 6197 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } } peers { permanent_uuid: "3d450e4b5d234cc08a659a2375cf9753" member_type: NON_VOTER last_known_addr { host: "127.12.158.68" port: 35609 } attrs { promote: true } } }
I20260502 14:06:21.173612 13720 catalog_manager.cc:5671] T 7e3a9b987a4e4c74b1f1af634b2f81ed P 44217426b1314bc9824ba6f1661fb95b reported cstate change: config changed from index -1 to 6199, NON_VOTER 3d450e4b5d234cc08a659a2375cf9753 (127.12.158.68) added. New cstate: current_term: 2 leader_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" committed_config { opid_index: 6199 OBSOLETE_local: false peers { permanent_uuid: "d6662a7688c84665b55623d3f97e4ae3" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 44257 } } peers { permanent_uuid: "44217426b1314bc9824ba6f1661fb95b" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 33299 } } peers { permanent_uuid: "a1ddd8ec7c554a1eadeb2aa58f7ab430" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 33939 } } peers { permanent_uuid: "3d450e4b5d234cc08a659a2375cf9753" member_type: NON_VOTER last_known_addr { host: "127.12.158.68" port: 35609 } attrs { promote: true } } }
W20260502 14:06:21.178417 13297 consensus_peers.cc:597] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer 3d450e4b5d234cc08a659a2375cf9753 (127.12.158.68:35609): Couldn't send request to peer 3d450e4b5d234cc08a659a2375cf9753. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: bf9cb1779b0345fe868c2b551293512a. This is attempt 1: this message will repeat every 5th retry.
W20260502 14:06:21.178508 13297 consensus_peers.cc:597] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430 -> Peer 3d450e4b5d234cc08a659a2375cf9753 (127.12.158.68:35609): Couldn't send request to peer 3d450e4b5d234cc08a659a2375cf9753. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: 7e3a9b987a4e4c74b1f1af634b2f81ed. This is attempt 1: this message will repeat every 5th retry.
I20260502 14:06:21.239912 14015 ts_tablet_manager.cc:933] T be812ec5b6a24e06bd0e507df248605e P 3d450e4b5d234cc08a659a2375cf9753: Initiating tablet copy from peer 44217426b1314bc9824ba6f1661fb95b (127.12.158.66:33299)
I20260502 14:06:21.241518 14015 tablet_copy_client.cc:323] T be812ec5b6a24e06bd0e507df248605e P 3d450e4b5d234cc08a659a2375cf9753: tablet copy: Beginning tablet copy session from remote peer at address 127.12.158.66:33299
I20260502 14:06:21.245877 14018 ts_tablet_manager.cc:933] T bf9cb1779b0345fe868c2b551293512a P 3d450e4b5d234cc08a659a2375cf9753: Initiating tablet copy from peer a1ddd8ec7c554a1eadeb2aa58f7ab430 (127.12.158.67:33939)
I20260502 14:06:21.247442 14018 tablet_copy_client.cc:323] T bf9cb1779b0345fe868c2b551293512a P 3d450e4b5d234cc08a659a2375cf9753: tablet copy: Beginning tablet copy session from remote peer at address 127.12.158.67:33939
I20260502 14:06:21.248526 14020 ts_tablet_manager.cc:933] T b175587f4e6a42a6b7e4740bae788705 P 3d450e4b5d234cc08a659a2375cf9753: Initiating tablet copy from peer 44217426b1314bc9824ba6f1661fb95b (127.12.158.66:33299)
I20260502 14:06:21.248770 14020 tablet_copy_client.cc:323] T b175587f4e6a42a6b7e4740bae788705 P 3d450e4b5d234cc08a659a2375cf9753: tablet copy: Beginning tablet copy session from remote peer at address 127.12.158.66:33299
I20260502 14:06:21.258939 14023 ts_tablet_manager.cc:933] T 1fd859f7e3144efcacca4175c53ef5df P 3d450e4b5d234cc08a659a2375cf9753: Initiating tablet copy from peer 44217426b1314bc9824ba6f1661fb95b (127.12.158.66:33299)
I20260502 14:06:21.261250 13384 tablet_copy_service.cc:140] P a1ddd8ec7c554a1eadeb2aa58f7ab430: Received BeginTabletCopySession request for tablet bf9cb1779b0345fe868c2b551293512a from peer 3d450e4b5d234cc08a659a2375cf9753 ({username='slave'} at 127.12.158.68:57109)
I20260502 14:06:21.261323 13384 tablet_copy_service.cc:161] P a1ddd8ec7c554a1eadeb2aa58f7ab430: Beginning new tablet copy session on tablet bf9cb1779b0345fe868c2b551293512a from peer 3d450e4b5d234cc08a659a2375cf9753 at {username='slave'} at 127.12.158.68:57109: session id = 3d450e4b5d234cc08a659a2375cf9753-bf9cb1779b0345fe868c2b551293512a
I20260502 14:06:21.260574 14023 tablet_copy_client.cc:323] T 1fd859f7e3144efcacca4175c53ef5df P 3d450e4b5d234cc08a659a2375cf9753: tablet copy: Beginning tablet copy session from remote peer at address 127.12.158.66:33299
I20260502 14:06:21.262034 13384 tablet_copy_source_session.cc:215] T bf9cb1779b0345fe868c2b551293512a P a1ddd8ec7c554a1eadeb2aa58f7ab430: Tablet Copy: opened 0 blocks and 1 log segments
I20260502 14:06:21.262902 14018 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet bf9cb1779b0345fe868c2b551293512a. 1 dirs total, 0 dirs full, 0 dirs failed
I20260502 14:06:21.266436 13253 tablet_copy_service.cc:140] P 44217426b1314bc9824ba6f1661fb95b: Received BeginTabletCopySession request for tablet be812ec5b6a24e06bd0e507df248605e from peer 3d450e4b5d234cc08a659a2375cf9753 ({username='slave'} at 127.12.158.68:60503)
I20260502 14:06:21.266502 13253 tablet_copy_service.cc:161] P 44217426b1314bc9824ba6f1661fb95b: Beginning new tablet copy session on tablet be812ec5b6a24e06bd0e507df248605e from peer 3d450e4b5d234cc08a659a2375cf9753 at {username='slave'} at 127.12.158.68:60503: session id = 3d450e4b5d234cc08a659a2375cf9753-be812ec5b6a24e06bd0e507df248605e
I20260502 14:06:21.267192 13253 tablet_copy_source_session.cc:215] T be812ec5b6a24e06bd0e507df248605e P 44217426b1314bc9824ba6f1661fb95b: Tablet Copy: opened 0 blocks and 1 log segments
I20260502 14:06:21.267387 13252 tablet_copy_service.cc:140] P 44217426b1314bc9824ba6f1661fb95b: Received BeginTabletCopySession request for tablet b175587f4e6a42a6b7e4740bae788705 from peer 3d450e4b5d234cc08a659a2375cf9753 ({username='slave'} at 127.12.158.68:60503)
I20260502 14:06:21.267428 13252 tablet_copy_service.cc:161] P 44217426b1314bc9824ba6f1661fb95b: Beginning new tablet copy session on tablet b175587f4e6a42a6b7e4740bae788705 from peer 3d450e4b5d234cc08a659a2375cf9753 at {username='slave'} at 127.12.158.68:60503: session id = 3d450e4b5d234cc08a659a2375cf9753-b175587f4e6a42a6b7e4740bae788705
I20260502 14:06:21.267992 13252 tablet_copy_source_session.cc:215] T b175587f4e6a42a6b7e4740bae788705 P 44217426b1314bc9824ba6f1661fb95b: Tablet Copy: opened 0 blocks and 1 log segments
I20260502 14:06:21.268085 13251 tablet_copy_service.cc:140] P 44217426b1314bc9824ba6f1661fb95b: Received BeginTabletCopySession request for tablet 1fd859f7e3144efcacca4175c53ef5df from peer 3d450e4b5d234cc08a659a2375cf9753 ({username='slave'} at 127.12.158.68:60503)
I20260502 14:06:21.268121 13251 tablet_copy_service.cc:161] P 44217426b1314bc9824ba6f1661fb95b: Beginning new tablet copy session on tablet 1fd859f7e3144efcacca4175c53ef5df from peer 3d450e4b5d234cc08a659a2375cf9753 at {username='slave'} at 127.12.158.68:60503: session id = 3d450e4b5d234cc08a659a2375cf9753-1fd859f7e3144efcacca4175c53ef5df
I20260502 14:06:21.268599 13251 tablet_copy_source_session.cc:215] T 1fd859f7e3144efcacca4175c53ef5df P 44217426b1314bc9824ba6f1661fb95b: Tablet Copy: opened 0 blocks and 1 log segments
I20260502 14:06:21.268805 14015 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet be812ec5b6a24e06bd0e507df248605e. 1 dirs total, 0 dirs full, 0 dirs failed
I20260502 14:06:21.268965 14023 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 1fd859f7e3144efcacca4175c53ef5df. 1 dirs total, 0 dirs full, 0 dirs failed
I20260502 14:06:21.269845 14023 tablet_copy_client.cc:806] T 1fd859f7e3144efcacca4175c53ef5df P 3d450e4b5d234cc08a659a2375cf9753: tablet copy: Starting download of 0 data blocks...
I20260502 14:06:21.269845 14015 tablet_copy_client.cc:806] T be812ec5b6a24e06bd0e507df248605e P 3d450e4b5d234cc08a659a2375cf9753: tablet copy: Starting download of 0 data blocks...
I20260502 14:06:21.270048 14015 tablet_copy_client.cc:670] T be812ec5b6a24e06bd0e507df248605e P 3d450e4b5d234cc08a659a2375cf9753: tablet copy: Starting download of 1 WAL segments...
I20260502 14:06:21.270084 14023 tablet_copy_client.cc:670] T 1fd859f7e3144efcacca4175c53ef5df P 3d450e4b5d234cc08a659a2375cf9753: tablet copy: Starting download of 1 WAL segments...
I20260502 14:06:21.270051 14025 ts_tablet_manager.cc:933] T d4c02bd85161406f918b1aee009c7294 P 3d450e4b5d234cc08a659a2375cf9753: Initiating tablet copy from peer 44217426b1314bc9824ba6f1661fb95b (127.12.158.66:33299)
I20260502 14:06:21.270478 14025 tablet_copy_client.cc:323] T d4c02bd85161406f918b1aee009c7294 P 3d450e4b5d234cc08a659a2375cf9753: tablet copy: Beginning tablet copy session from remote peer at address 127.12.158.66:33299
I20260502 14:06:21.276654 14018 tablet_copy_client.cc:806] T bf9cb1779b0345fe868c2b551293512a P 3d450e4b5d234cc08a659a2375cf9753: tablet copy: Starting download of 0 data blocks...
I20260502 14:06:21.277858 13252 tablet_copy_service.cc:140] P 44217426b1314bc9824ba6f1661fb95b: Received BeginTabletCopySession request for tablet d4c02bd85161406f918b1aee009c7294 from peer 3d450e4b5d234cc08a659a2375cf9753 ({username='slave'} at 127.12.158.68:60503)
I20260502 14:06:21.277915 13252 tablet_copy_service.cc:161] P 44217426b1314bc9824ba6f1661fb95b: Beginning new tablet copy session on tablet d4c02bd85161406f918b1aee009c7294 from peer 3d450e4b5d234cc08a659a2375cf9753 at {username='slave'} at 127.12.158.68:60503: session id = 3d450e4b5d234cc08a659a2375cf9753-d4c02bd85161406f918b1aee009c7294
I20260502 14:06:21.278429 13252 tablet_copy_source_session.cc:215] T d4c02bd85161406f918b1aee009c7294 P 44217426b1314bc9824ba6f1661fb95b: Tablet Copy: opened 0 blocks and 1 log segments
I20260502 14:06:21.278724 14018 tablet_copy_client.cc:670] T bf9cb1779b0345fe868c2b551293512a P 3d450e4b5d234cc08a659a2375cf9753: tablet copy: Starting download of 1 WAL segments...
I20260502 14:06:21.279021 14020 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet b175587f4e6a42a6b7e4740bae788705. 1 dirs total, 0 dirs full, 0 dirs failed
I20260502 14:06:21.280743 14025 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet d4c02bd85161406f918b1aee009c7294. 1 dirs total, 0 dirs full, 0 dirs failed
I20260502 14:06:21.281977 14020 tablet_copy_client.cc:806] T b175587f4e6a42a6b7e4740bae788705 P 3d450e4b5d234cc08a659a2375cf9753: tablet copy: Starting download of 0 data blocks...
I20260502 14:06:21.282078 14020 tablet_copy_client.cc:670] T b175587f4e6a42a6b7e4740bae788705 P 3d450e4b5d234cc08a659a2375cf9753: tablet copy: Starting download of 1 WAL segments...
I20260502 14:06:21.292760 14025 tablet_copy_client.cc:806] T d4c02bd85161406f918b1aee009c7294 P 3d450e4b5d234cc08a659a2375cf9753: tablet copy: Starting download of 0 data blocks...
I20260502 14:06:21.292945 14025 tablet_copy_client.cc:670] T d4c02bd85161406f918b1aee009c7294 P 3d450e4b5d234cc08a659a2375cf9753: tablet copy: Starting download of 1 WAL segments...
I20260502 14:06:21.299350 14027 ts_tablet_manager.cc:933] T 7e3a9b987a4e4c74b1f1af634b2f81ed P 3d450e4b5d234cc08a659a2375cf9753: Initiating tablet copy from peer a1ddd8ec7c554a1eadeb2aa58f7ab430 (127.12.158.67:33939)
I20260502 14:06:21.299801 14027 tablet_copy_client.cc:323] T 7e3a9b987a4e4c74b1f1af634b2f81ed P 3d450e4b5d234cc08a659a2375cf9753: tablet copy: Beginning tablet copy session from remote peer at address 127.12.158.67:33939
I20260502 14:06:21.318630 14015 tablet_copy_client.cc:538] T be812ec5b6a24e06bd0e507df248605e P 3d450e4b5d234cc08a659a2375cf9753: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20260502 14:06:21.319823 14015 tablet_bootstrap.cc:492] T be812ec5b6a24e06bd0e507df248605e P 3d450e4b5d234cc08a659a2375cf9753: Bootstrap starting.
I20260502 14:06:21.327788 14020 tablet_copy_client.cc:538] T b175587f4e6a42a6b7e4740bae788705 P 3d450e4b5d234cc08a659a2375cf9753: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20260502 14:06:21.327803 13384 tablet_copy_service.cc:140] P a1ddd8ec7c554a1eadeb2aa58f7ab430: Received BeginTabletCopySession request for tablet 7e3a9b987a4e4c74b1f1af634b2f81ed from peer 3d450e4b5d234cc08a659a2375cf9753 ({username='slave'} at 127.12.158.68:57109)
I20260502 14:06:21.327879 13384 tablet_copy_service.cc:161] P a1ddd8ec7c554a1eadeb2aa58f7ab430: Beginning new tablet copy session on tablet 7e3a9b987a4e4c74b1f1af634b2f81ed from peer 3d450e4b5d234cc08a659a2375cf9753 at {username='slave'} at 127.12.158.68:57109: session id = 3d450e4b5d234cc08a659a2375cf9753-7e3a9b987a4e4c74b1f1af634b2f81ed
I20260502 14:06:21.328430 13384 tablet_copy_source_session.cc:215] T 7e3a9b987a4e4c74b1f1af634b2f81ed P a1ddd8ec7c554a1eadeb2aa58f7ab430: Tablet Copy: opened 0 blocks and 1 log segments
I20260502 14:06:21.328867 14020 tablet_bootstrap.cc:492] T b175587f4e6a42a6b7e4740bae788705 P 3d450e4b5d234cc08a659a2375cf9753: Bootstrap starting.
I20260502 14:06:21.328877 14027 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 7e3a9b987a4e4c74b1f1af634b2f81ed. 1 dirs total, 0 dirs full, 0 dirs failed
I20260502 14:06:21.330317 14018 tablet_copy_client.cc:538] T bf9cb1779b0345fe868c2b551293512a P 3d450e4b5d234cc08a659a2375cf9753: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20260502 14:06:21.330569 14027 tablet_copy_client.cc:806] T 7e3a9b987a4e4c74b1f1af634b2f81ed P 3d450e4b5d234cc08a659a2375cf9753: tablet copy: Starting download of 0 data blocks...
I20260502 14:06:21.330658 14027 tablet_copy_client.cc:670] T 7e3a9b987a4e4c74b1f1af634b2f81ed P 3d450e4b5d234cc08a659a2375cf9753: tablet copy: Starting download of 1 WAL segments...
I20260502 14:06:21.331019 14018 tablet_bootstrap.cc:492] T bf9cb1779b0345fe868c2b551293512a P 3d450e4b5d234cc08a659a2375cf9753: Bootstrap starting.
I20260502 14:06:21.356598 14023 tablet_copy_client.cc:538] T 1fd859f7e3144efcacca4175c53ef5df P 3d450e4b5d234cc08a659a2375cf9753: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20260502 14:06:21.357580 14023 tablet_bootstrap.cc:492] T 1fd859f7e3144efcacca4175c53ef5df P 3d450e4b5d234cc08a659a2375cf9753: Bootstrap starting.
I20260502 14:06:21.362558 14027 tablet_copy_client.cc:538] T 7e3a9b987a4e4c74b1f1af634b2f81ed P 3d450e4b5d234cc08a659a2375cf9753: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20260502 14:06:21.363392 14027 tablet_bootstrap.cc:492] T 7e3a9b987a4e4c74b1f1af634b2f81ed P 3d450e4b5d234cc08a659a2375cf9753: Bootstrap starting.
I20260502 14:06:21.378907 14025 tablet_copy_client.cc:538] T d4c02bd85161406f918b1aee009c7294 P 3d450e4b5d234cc08a659a2375cf9753: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20260502 14:06:21.379984 14025 tablet_bootstrap.cc:492] T d4c02bd85161406f918b1aee009c7294 P 3d450e4b5d234cc08a659a2375cf9753: Bootstrap starting.
I20260502 14:06:21.453364 14015 log.cc:826] T be812ec5b6a24e06bd0e507df248605e P 3d450e4b5d234cc08a659a2375cf9753: Log is configured to *not* fsync() on all Append() calls
I20260502 14:06:21.455284 12921 meta_cache.cc:1510] marking tablet server d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257) as failed
W20260502 14:06:21.455353 12921 meta_cache.cc:302] tablet 7e3a9b987a4e4c74b1f1af634b2f81ed: replica d6662a7688c84665b55623d3f97e4ae3 (127.12.158.65:44257) has failed: Network error: TS failed: Client connection negotiation failed: client connection to 127.12.158.65:44257: connect: Connection refused (error 111)
I20260502 14:06:21.520825 12921 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskW6FTeC/build/release/bin/kudu with pid 13151
I20260502 14:06:21.552062 12921 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskW6FTeC/build/release/bin/kudu with pid 13282
I20260502 14:06:21.594249 12921 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskW6FTeC/build/release/bin/kudu with pid 13472
I20260502 14:06:21.599522 12921 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskW6FTeC/build/release/bin/kudu with pid 13698
2026-05-02T14:06:21Z chronyd exiting
[ OK ] MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate (15433 ms)
[----------] 1 test from MaintenanceModeRF3ITest (15433 ms total)
[----------] 1 test from RollingRestartArgs/RollingRestartITest
[ RUN ] RollingRestartArgs/RollingRestartITest.TestWorkloads/4
2026-05-02T14:06:21Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2026-05-02T14:06:21Z Disabled control of system clock
I20260502 14:06:21.648343 12921 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskW6FTeC/build/release/bin/kudu
/tmp/dist-test-taskW6FTeC/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.12.158.126:36477
--webserver_interface=127.12.158.126
--webserver_port=0
--builtin_ntp_servers=127.12.158.84:38351
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.12.158.126:36477
--location_mapping_cmd=/tmp/dist-test-taskW6FTeC/build/release/bin/testdata/assign-location.py --state_store=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/location-assignment.state --map /L0:4
--master_client_location_assignment_enabled=false with env {}
W20260502 14:06:21.722445 14042 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260502 14:06:21.722591 14042 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260502 14:06:21.722609 14042 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260502 14:06:21.724146 14042 flags.cc:432] Enabled experimental flag: --ipki_ca_key_size=768
W20260502 14:06:21.724184 14042 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260502 14:06:21.724197 14042 flags.cc:432] Enabled experimental flag: --tsk_num_rsa_bits=512
W20260502 14:06:21.724208 14042 flags.cc:432] Enabled experimental flag: --rpc_reuseport=true
I20260502 14:06:21.725677 14042 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.12.158.84:38351
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/master-0/wal
--location_mapping_cmd=/tmp/dist-test-taskW6FTeC/build/release/bin/testdata/assign-location.py --state_store=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/location-assignment.state --map /L0:4
--ipki_ca_key_size=768
--master_addresses=127.12.158.126:36477
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.12.158.126:36477
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.12.158.126
--webserver_port=0
--never_fsync=true
--heap_profile_path=/tmp/kudu.14042
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true
Master server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type RELEASE
built by None at 02 May 2026 13:43:15 UTC on e7f111948823
build id 11709
I20260502 14:06:21.725847 14042 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260502 14:06:21.726100 14042 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260502 14:06:21.728502 14050 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:21.728834 14047 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:21.728886 14048 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260502 14:06:21.728890 14042 server_base.cc:1061] running on GCE node
I20260502 14:06:21.729142 14042 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260502 14:06:21.729346 14042 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260502 14:06:21.730505 14042 hybrid_clock.cc:648] HybridClock initialized: now 1777730781730477 us; error 43 us; skew 500 ppm
I20260502 14:06:21.731740 14042 webserver.cc:492] Webserver started at http://127.12.158.126:41737/ using document root <none> and password file <none>
I20260502 14:06:21.732056 14042 fs_manager.cc:362] Metadata directory not provided
I20260502 14:06:21.732102 14042 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260502 14:06:21.732218 14042 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260502 14:06:21.733048 14042 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/master-0/data/instance:
uuid: "db0222d95fa840ba907de6919ae9e535"
format_stamp: "Formatted at 2026-05-02 14:06:21 on dist-test-slave-23m0"
I20260502 14:06:21.733364 14042 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/master-0/wal/instance:
uuid: "db0222d95fa840ba907de6919ae9e535"
format_stamp: "Formatted at 2026-05-02 14:06:21 on dist-test-slave-23m0"
I20260502 14:06:21.734474 14042 fs_manager.cc:696] Time spent creating directory manager: real 0.001s user 0.000s sys 0.002s
I20260502 14:06:21.735116 14056 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:21.735283 14042 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.000s sys 0.001s
I20260502 14:06:21.735357 14042 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/master-0/data,/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/master-0/wal
uuid: "db0222d95fa840ba907de6919ae9e535"
format_stamp: "Formatted at 2026-05-02 14:06:21 on dist-test-slave-23m0"
I20260502 14:06:21.735430 14042 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260502 14:06:21.752503 14042 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260502 14:06:21.752764 14042 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260502 14:06:21.752890 14042 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260502 14:06:21.756683 14042 rpc_server.cc:307] RPC server started. Bound to: 127.12.158.126:36477
I20260502 14:06:21.756711 14108 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.12.158.126:36477 every 8 connection(s)
I20260502 14:06:21.757009 14042 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/master-0/data/info.pb
I20260502 14:06:21.757508 14109 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20260502 14:06:21.759601 14109 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P db0222d95fa840ba907de6919ae9e535: Bootstrap starting.
I20260502 14:06:21.760190 14109 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P db0222d95fa840ba907de6919ae9e535: Neither blocks nor log segments found. Creating new log.
I20260502 14:06:21.760444 14109 log.cc:826] T 00000000000000000000000000000000 P db0222d95fa840ba907de6919ae9e535: Log is configured to *not* fsync() on all Append() calls
I20260502 14:06:21.761044 14109 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P db0222d95fa840ba907de6919ae9e535: No bootstrap required, opened a new log
I20260502 14:06:21.761945 12921 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskW6FTeC/build/release/bin/kudu as pid 14042
I20260502 14:06:21.762044 12921 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/master-0/wal/instance
I20260502 14:06:21.762418 14109 raft_consensus.cc:359] T 00000000000000000000000000000000 P db0222d95fa840ba907de6919ae9e535 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "db0222d95fa840ba907de6919ae9e535" member_type: VOTER last_known_addr { host: "127.12.158.126" port: 36477 } }
I20260502 14:06:21.762585 14109 raft_consensus.cc:385] T 00000000000000000000000000000000 P db0222d95fa840ba907de6919ae9e535 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260502 14:06:21.762624 14109 raft_consensus.cc:740] T 00000000000000000000000000000000 P db0222d95fa840ba907de6919ae9e535 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: db0222d95fa840ba907de6919ae9e535, State: Initialized, Role: FOLLOWER
I20260502 14:06:21.762735 14109 consensus_queue.cc:260] T 00000000000000000000000000000000 P db0222d95fa840ba907de6919ae9e535 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "db0222d95fa840ba907de6919ae9e535" member_type: VOTER last_known_addr { host: "127.12.158.126" port: 36477 } }
I20260502 14:06:21.762815 14109 raft_consensus.cc:399] T 00000000000000000000000000000000 P db0222d95fa840ba907de6919ae9e535 [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20260502 14:06:21.762849 14109 raft_consensus.cc:493] T 00000000000000000000000000000000 P db0222d95fa840ba907de6919ae9e535 [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20260502 14:06:21.762908 14109 raft_consensus.cc:3060] T 00000000000000000000000000000000 P db0222d95fa840ba907de6919ae9e535 [term 0 FOLLOWER]: Advancing to term 1
I20260502 14:06:21.763382 14109 raft_consensus.cc:515] T 00000000000000000000000000000000 P db0222d95fa840ba907de6919ae9e535 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "db0222d95fa840ba907de6919ae9e535" member_type: VOTER last_known_addr { host: "127.12.158.126" port: 36477 } }
I20260502 14:06:21.763471 14109 leader_election.cc:304] T 00000000000000000000000000000000 P db0222d95fa840ba907de6919ae9e535 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: db0222d95fa840ba907de6919ae9e535; no voters:
I20260502 14:06:21.763638 14109 leader_election.cc:290] T 00000000000000000000000000000000 P db0222d95fa840ba907de6919ae9e535 [CANDIDATE]: Term 1 election: Requested vote from peers
I20260502 14:06:21.763682 14114 raft_consensus.cc:2804] T 00000000000000000000000000000000 P db0222d95fa840ba907de6919ae9e535 [term 1 FOLLOWER]: Leader election won for term 1
I20260502 14:06:21.763882 14114 raft_consensus.cc:697] T 00000000000000000000000000000000 P db0222d95fa840ba907de6919ae9e535 [term 1 LEADER]: Becoming Leader. State: Replica: db0222d95fa840ba907de6919ae9e535, State: Running, Role: LEADER
I20260502 14:06:21.763882 14109 sys_catalog.cc:565] T 00000000000000000000000000000000 P db0222d95fa840ba907de6919ae9e535 [sys.catalog]: configured and running, proceeding with master startup.
I20260502 14:06:21.764007 14114 consensus_queue.cc:237] T 00000000000000000000000000000000 P db0222d95fa840ba907de6919ae9e535 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "db0222d95fa840ba907de6919ae9e535" member_type: VOTER last_known_addr { host: "127.12.158.126" port: 36477 } }
I20260502 14:06:21.764431 14115 sys_catalog.cc:455] T 00000000000000000000000000000000 P db0222d95fa840ba907de6919ae9e535 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "db0222d95fa840ba907de6919ae9e535" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "db0222d95fa840ba907de6919ae9e535" member_type: VOTER last_known_addr { host: "127.12.158.126" port: 36477 } } }
I20260502 14:06:21.764513 14115 sys_catalog.cc:458] T 00000000000000000000000000000000 P db0222d95fa840ba907de6919ae9e535 [sys.catalog]: This master's current role is: LEADER
I20260502 14:06:21.764676 14121 catalog_manager.cc:1485] Loading table and tablet metadata into memory...
I20260502 14:06:21.765121 14121 catalog_manager.cc:1494] Initializing Kudu cluster ID...
I20260502 14:06:21.765581 14116 sys_catalog.cc:455] T 00000000000000000000000000000000 P db0222d95fa840ba907de6919ae9e535 [sys.catalog]: SysCatalogTable state changed. Reason: New leader db0222d95fa840ba907de6919ae9e535. Latest consensus state: current_term: 1 leader_uuid: "db0222d95fa840ba907de6919ae9e535" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "db0222d95fa840ba907de6919ae9e535" member_type: VOTER last_known_addr { host: "127.12.158.126" port: 36477 } } }
I20260502 14:06:21.765676 14116 sys_catalog.cc:458] T 00000000000000000000000000000000 P db0222d95fa840ba907de6919ae9e535 [sys.catalog]: This master's current role is: LEADER
I20260502 14:06:21.766923 14121 catalog_manager.cc:1357] Generated new cluster ID: 68b3d523ee254d10bfd68be33af77117
I20260502 14:06:21.766973 14121 catalog_manager.cc:1505] Initializing Kudu internal certificate authority...
I20260502 14:06:21.792757 14121 catalog_manager.cc:1380] Generated new certificate authority record
I20260502 14:06:21.793282 14121 catalog_manager.cc:1514] Loading token signing keys...
I20260502 14:06:21.797773 14121 catalog_manager.cc:6044] T 00000000000000000000000000000000 P db0222d95fa840ba907de6919ae9e535: Generated new TSK 0
I20260502 14:06:21.797946 14121 catalog_manager.cc:1524] Initializing in-progress tserver states...
I20260502 14:06:21.805101 12921 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskW6FTeC/build/release/bin/kudu
/tmp/dist-test-taskW6FTeC/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.12.158.65:0
--local_ip_for_outbound_sockets=127.12.158.65
--webserver_interface=127.12.158.65
--webserver_port=0
--tserver_master_addrs=127.12.158.126:36477
--builtin_ntp_servers=127.12.158.84:38351
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20260502 14:06:21.888972 14133 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260502 14:06:21.889127 14133 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260502 14:06:21.889147 14133 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260502 14:06:21.890612 14133 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260502 14:06:21.890666 14133 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.12.158.65
I20260502 14:06:21.892208 14133 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.12.158.84:38351
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.12.158.65:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.12.158.65
--webserver_port=0
--tserver_master_addrs=127.12.158.126:36477
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.14133
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.12.158.65
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type RELEASE
built by None at 02 May 2026 13:43:15 UTC on e7f111948823
build id 11709
I20260502 14:06:21.892417 14133 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260502 14:06:21.892647 14133 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260502 14:06:21.893276 14133 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260502 14:06:21.895058 14139 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:21.895059 14141 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:21.895049 14138 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260502 14:06:21.895465 14133 server_base.cc:1061] running on GCE node
I20260502 14:06:21.895613 14133 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260502 14:06:21.895833 14133 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260502 14:06:21.896999 14133 hybrid_clock.cc:648] HybridClock initialized: now 1777730781896979 us; error 33 us; skew 500 ppm
I20260502 14:06:21.898203 14133 webserver.cc:492] Webserver started at http://127.12.158.65:41631/ using document root <none> and password file <none>
I20260502 14:06:21.898427 14133 fs_manager.cc:362] Metadata directory not provided
I20260502 14:06:21.898480 14133 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260502 14:06:21.898604 14133 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260502 14:06:21.899925 14133 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/data/instance:
uuid: "c9dd32be290b436fb7f776b6f481451b"
format_stamp: "Formatted at 2026-05-02 14:06:21 on dist-test-slave-23m0"
I20260502 14:06:21.900331 14133 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/wal/instance:
uuid: "c9dd32be290b436fb7f776b6f481451b"
format_stamp: "Formatted at 2026-05-02 14:06:21 on dist-test-slave-23m0"
I20260502 14:06:21.901854 14133 fs_manager.cc:696] Time spent creating directory manager: real 0.001s user 0.000s sys 0.002s
I20260502 14:06:21.902665 14147 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:21.902946 14133 fs_manager.cc:730] Time spent opening block manager: real 0.001s user 0.000s sys 0.001s
I20260502 14:06:21.903039 14133 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/data,/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/wal
uuid: "c9dd32be290b436fb7f776b6f481451b"
format_stamp: "Formatted at 2026-05-02 14:06:21 on dist-test-slave-23m0"
I20260502 14:06:21.903100 14133 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260502 14:06:21.929224 14133 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260502 14:06:21.929546 14133 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260502 14:06:21.929687 14133 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260502 14:06:21.929929 14133 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260502 14:06:21.930299 14133 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260502 14:06:21.930343 14133 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:21.930374 14133 ts_tablet_manager.cc:616] Registered 0 tablets
I20260502 14:06:21.930401 14133 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:21.937032 14133 rpc_server.cc:307] RPC server started. Bound to: 127.12.158.65:34909
I20260502 14:06:21.937114 14260 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.12.158.65:34909 every 8 connection(s)
I20260502 14:06:21.937377 14133 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/data/info.pb
I20260502 14:06:21.940160 12921 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskW6FTeC/build/release/bin/kudu as pid 14133
I20260502 14:06:21.940299 12921 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/wal/instance
I20260502 14:06:21.941500 12921 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskW6FTeC/build/release/bin/kudu
/tmp/dist-test-taskW6FTeC/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.12.158.66:0
--local_ip_for_outbound_sockets=127.12.158.66
--webserver_interface=127.12.158.66
--webserver_port=0
--tserver_master_addrs=127.12.158.126:36477
--builtin_ntp_servers=127.12.158.84:38351
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20260502 14:06:21.942888 14261 heartbeater.cc:344] Connected to a master server at 127.12.158.126:36477
I20260502 14:06:21.942999 14261 heartbeater.cc:461] Registering TS with master...
I20260502 14:06:21.943181 14261 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
I20260502 14:06:21.990688 14073 ts_manager.cc:194] Registered new tserver with Master: c9dd32be290b436fb7f776b6f481451b (127.12.158.65:34909)
I20260502 14:06:21.991454 14073 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.12.158.65:59731
W20260502 14:06:22.025630 14264 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260502 14:06:22.025813 14264 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260502 14:06:22.025846 14264 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260502 14:06:22.027305 14264 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260502 14:06:22.027386 14264 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.12.158.66
I20260502 14:06:22.028894 14264 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.12.158.84:38351
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.12.158.66:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.12.158.66
--webserver_port=0
--tserver_master_addrs=127.12.158.126:36477
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.14264
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.12.158.66
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type RELEASE
built by None at 02 May 2026 13:43:15 UTC on e7f111948823
build id 11709
I20260502 14:06:22.029129 14264 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260502 14:06:22.029342 14264 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260502 14:06:22.030004 14264 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
I20260502 14:06:22.031965 14264 server_base.cc:1061] running on GCE node
W20260502 14:06:22.031927 14273 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:22.031919 14270 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:22.031919 14271 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260502 14:06:22.032325 14264 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260502 14:06:22.032546 14264 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260502 14:06:22.033707 14264 hybrid_clock.cc:648] HybridClock initialized: now 1777730782033673 us; error 40 us; skew 500 ppm
I20260502 14:06:22.034668 14264 webserver.cc:492] Webserver started at http://127.12.158.66:33233/ using document root <none> and password file <none>
I20260502 14:06:22.034880 14264 fs_manager.cc:362] Metadata directory not provided
I20260502 14:06:22.034947 14264 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260502 14:06:22.035053 14264 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260502 14:06:22.035933 14264 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/data/instance:
uuid: "8b5f3bbadc484c728f336c95a8d8fd78"
format_stamp: "Formatted at 2026-05-02 14:06:22 on dist-test-slave-23m0"
I20260502 14:06:22.036244 14264 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/wal/instance:
uuid: "8b5f3bbadc484c728f336c95a8d8fd78"
format_stamp: "Formatted at 2026-05-02 14:06:22 on dist-test-slave-23m0"
I20260502 14:06:22.037379 14264 fs_manager.cc:696] Time spent creating directory manager: real 0.001s user 0.001s sys 0.000s
I20260502 14:06:22.038049 14279 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:22.038266 14264 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.001s sys 0.000s
I20260502 14:06:22.038355 14264 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/data,/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/wal
uuid: "8b5f3bbadc484c728f336c95a8d8fd78"
format_stamp: "Formatted at 2026-05-02 14:06:22 on dist-test-slave-23m0"
I20260502 14:06:22.038431 14264 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260502 14:06:22.053997 14264 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260502 14:06:22.054275 14264 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260502 14:06:22.054407 14264 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260502 14:06:22.054613 14264 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260502 14:06:22.054930 14264 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260502 14:06:22.054986 14264 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:22.055029 14264 ts_tablet_manager.cc:616] Registered 0 tablets
I20260502 14:06:22.055053 14264 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:22.060544 14264 rpc_server.cc:307] RPC server started. Bound to: 127.12.158.66:42349
I20260502 14:06:22.060617 14392 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.12.158.66:42349 every 8 connection(s)
I20260502 14:06:22.060918 14264 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/data/info.pb
I20260502 14:06:22.065531 14393 heartbeater.cc:344] Connected to a master server at 127.12.158.126:36477
I20260502 14:06:22.065608 14393 heartbeater.cc:461] Registering TS with master...
I20260502 14:06:22.065796 14393 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
I20260502 14:06:22.066850 12921 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskW6FTeC/build/release/bin/kudu as pid 14264
I20260502 14:06:22.066934 12921 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/wal/instance
I20260502 14:06:22.069005 12921 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskW6FTeC/build/release/bin/kudu
/tmp/dist-test-taskW6FTeC/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.12.158.67:0
--local_ip_for_outbound_sockets=127.12.158.67
--webserver_interface=127.12.158.67
--webserver_port=0
--tserver_master_addrs=127.12.158.126:36477
--builtin_ntp_servers=127.12.158.84:38351
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20260502 14:06:22.111644 14073 ts_manager.cc:194] Registered new tserver with Master: 8b5f3bbadc484c728f336c95a8d8fd78 (127.12.158.66:42349)
I20260502 14:06:22.112217 14073 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.12.158.66:55775
W20260502 14:06:22.158726 14397 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260502 14:06:22.158879 14397 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260502 14:06:22.158897 14397 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260502 14:06:22.160403 14397 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260502 14:06:22.160454 14397 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.12.158.67
I20260502 14:06:22.161952 14397 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.12.158.84:38351
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.12.158.67:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.12.158.67
--webserver_port=0
--tserver_master_addrs=127.12.158.126:36477
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.14397
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.12.158.67
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type RELEASE
built by None at 02 May 2026 13:43:15 UTC on e7f111948823
build id 11709
I20260502 14:06:22.162143 14397 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260502 14:06:22.162374 14397 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260502 14:06:22.163089 14397 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260502 14:06:22.165062 14403 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:22.165086 14405 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:22.165086 14402 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260502 14:06:22.165277 14397 server_base.cc:1061] running on GCE node
I20260502 14:06:22.165500 14397 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260502 14:06:22.165746 14397 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260502 14:06:22.166898 14397 hybrid_clock.cc:648] HybridClock initialized: now 1777730782166882 us; error 32 us; skew 500 ppm
I20260502 14:06:22.167982 14397 webserver.cc:492] Webserver started at http://127.12.158.67:37231/ using document root <none> and password file <none>
I20260502 14:06:22.168177 14397 fs_manager.cc:362] Metadata directory not provided
I20260502 14:06:22.168239 14397 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260502 14:06:22.168347 14397 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260502 14:06:22.169134 14397 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/data/instance:
uuid: "fcb69d1fc3094c95bd74e18f784e388d"
format_stamp: "Formatted at 2026-05-02 14:06:22 on dist-test-slave-23m0"
I20260502 14:06:22.169435 14397 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/wal/instance:
uuid: "fcb69d1fc3094c95bd74e18f784e388d"
format_stamp: "Formatted at 2026-05-02 14:06:22 on dist-test-slave-23m0"
I20260502 14:06:22.170560 14397 fs_manager.cc:696] Time spent creating directory manager: real 0.001s user 0.002s sys 0.000s
I20260502 14:06:22.171250 14411 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:22.171425 14397 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.001s sys 0.000s
I20260502 14:06:22.171500 14397 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/data,/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/wal
uuid: "fcb69d1fc3094c95bd74e18f784e388d"
format_stamp: "Formatted at 2026-05-02 14:06:22 on dist-test-slave-23m0"
I20260502 14:06:22.171574 14397 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260502 14:06:22.183593 14397 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260502 14:06:22.183928 14397 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260502 14:06:22.184072 14397 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260502 14:06:22.184281 14397 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260502 14:06:22.184567 14397 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260502 14:06:22.184617 14397 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:22.184648 14397 ts_tablet_manager.cc:616] Registered 0 tablets
I20260502 14:06:22.184689 14397 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:22.190028 14397 rpc_server.cc:307] RPC server started. Bound to: 127.12.158.67:44861
I20260502 14:06:22.190104 14524 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.12.158.67:44861 every 8 connection(s)
I20260502 14:06:22.190411 14397 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/data/info.pb
I20260502 14:06:22.194094 12921 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskW6FTeC/build/release/bin/kudu as pid 14397
I20260502 14:06:22.194183 12921 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/wal/instance
I20260502 14:06:22.195145 14525 heartbeater.cc:344] Connected to a master server at 127.12.158.126:36477
I20260502 14:06:22.195238 14525 heartbeater.cc:461] Registering TS with master...
I20260502 14:06:22.195360 12921 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskW6FTeC/build/release/bin/kudu
/tmp/dist-test-taskW6FTeC/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.12.158.68:0
--local_ip_for_outbound_sockets=127.12.158.68
--webserver_interface=127.12.158.68
--webserver_port=0
--tserver_master_addrs=127.12.158.126:36477
--builtin_ntp_servers=127.12.158.84:38351
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20260502 14:06:22.195425 14525 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
I20260502 14:06:22.231247 14073 ts_manager.cc:194] Registered new tserver with Master: fcb69d1fc3094c95bd74e18f784e388d (127.12.158.67:44861)
I20260502 14:06:22.231935 14073 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.12.158.67:34125
W20260502 14:06:22.272177 14528 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260502 14:06:22.272327 14528 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260502 14:06:22.272347 14528 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260502 14:06:22.273777 14528 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260502 14:06:22.273821 14528 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.12.158.68
I20260502 14:06:22.275314 14528 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.12.158.84:38351
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.12.158.68:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.12.158.68
--webserver_port=0
--tserver_master_addrs=127.12.158.126:36477
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.14528
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.12.158.68
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type RELEASE
built by None at 02 May 2026 13:43:15 UTC on e7f111948823
build id 11709
I20260502 14:06:22.275494 14528 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260502 14:06:22.275662 14528 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260502 14:06:22.276320 14528 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260502 14:06:22.278023 14537 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:22.278023 14534 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:22.278115 14535 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260502 14:06:22.278189 14528 server_base.cc:1061] running on GCE node
I20260502 14:06:22.278437 14528 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260502 14:06:22.278597 14528 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260502 14:06:22.279726 14528 hybrid_clock.cc:648] HybridClock initialized: now 1777730782279717 us; error 30 us; skew 500 ppm
I20260502 14:06:22.280865 14528 webserver.cc:492] Webserver started at http://127.12.158.68:44081/ using document root <none> and password file <none>
I20260502 14:06:22.281073 14528 fs_manager.cc:362] Metadata directory not provided
I20260502 14:06:22.281139 14528 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260502 14:06:22.281252 14528 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260502 14:06:22.282086 14528 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/data/instance:
uuid: "bd0a653794c34d9591e2d5c89c802493"
format_stamp: "Formatted at 2026-05-02 14:06:22 on dist-test-slave-23m0"
I20260502 14:06:22.282404 14528 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/wal/instance:
uuid: "bd0a653794c34d9591e2d5c89c802493"
format_stamp: "Formatted at 2026-05-02 14:06:22 on dist-test-slave-23m0"
I20260502 14:06:22.283535 14528 fs_manager.cc:696] Time spent creating directory manager: real 0.001s user 0.002s sys 0.000s
I20260502 14:06:22.284317 14543 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:22.284482 14528 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:22.284559 14528 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/data,/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/wal
uuid: "bd0a653794c34d9591e2d5c89c802493"
format_stamp: "Formatted at 2026-05-02 14:06:22 on dist-test-slave-23m0"
I20260502 14:06:22.284632 14528 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260502 14:06:22.309448 14528 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260502 14:06:22.309811 14528 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260502 14:06:22.309954 14528 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260502 14:06:22.310154 14528 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260502 14:06:22.310447 14528 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260502 14:06:22.310500 14528 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:22.310540 14528 ts_tablet_manager.cc:616] Registered 0 tablets
I20260502 14:06:22.310583 14528 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:22.315919 14528 rpc_server.cc:307] RPC server started. Bound to: 127.12.158.68:37147
I20260502 14:06:22.315973 14656 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.12.158.68:37147 every 8 connection(s)
I20260502 14:06:22.316229 14528 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/data/info.pb
I20260502 14:06:22.319976 12921 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskW6FTeC/build/release/bin/kudu as pid 14528
I20260502 14:06:22.320091 12921 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/wal/instance
I20260502 14:06:22.320209 14657 heartbeater.cc:344] Connected to a master server at 127.12.158.126:36477
I20260502 14:06:22.320293 14657 heartbeater.cc:461] Registering TS with master...
I20260502 14:06:22.320482 14657 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
I20260502 14:06:22.353199 14072 ts_manager.cc:194] Registered new tserver with Master: bd0a653794c34d9591e2d5c89c802493 (127.12.158.68:37147)
I20260502 14:06:22.353735 14072 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.12.158.68:34363
I20260502 14:06:22.354111 12921 external_mini_cluster.cc:949] 4 TS(s) registered with all masters
I20260502 14:06:22.365893 14072 catalog_manager.cc:2257] Servicing CreateTable request from {username='slave'} at 127.0.0.1:33914:
name: "test-workload"
schema {
columns {
name: "key"
type: INT32
is_key: true
is_nullable: false
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
columns {
name: "int_val"
type: INT32
is_key: false
is_nullable: false
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
columns {
name: "string_val"
type: STRING
is_key: false
is_nullable: true
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
}
num_replicas: 3
split_rows_range_bounds {
}
partition_schema {
range_schema {
columns {
name: "key"
}
}
}
I20260502 14:06:22.372632 14459 tablet_service.cc:1511] Processing CreateTablet for tablet 4710965d239c4545af4272cddced8dfb (DEFAULT_TABLE table=test-workload [id=198a5c74696b4d95ba10704c64e5d4f5]), partition=RANGE (key) PARTITION UNBOUNDED
I20260502 14:06:22.372928 14459 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 4710965d239c4545af4272cddced8dfb. 1 dirs total, 0 dirs full, 0 dirs failed
I20260502 14:06:22.373014 14327 tablet_service.cc:1511] Processing CreateTablet for tablet 4710965d239c4545af4272cddced8dfb (DEFAULT_TABLE table=test-workload [id=198a5c74696b4d95ba10704c64e5d4f5]), partition=RANGE (key) PARTITION UNBOUNDED
I20260502 14:06:22.373200 14327 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 4710965d239c4545af4272cddced8dfb. 1 dirs total, 0 dirs full, 0 dirs failed
I20260502 14:06:22.374738 14680 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Bootstrap starting.
I20260502 14:06:22.375429 14680 tablet_bootstrap.cc:654] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Neither blocks nor log segments found. Creating new log.
I20260502 14:06:22.375406 14681 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Bootstrap starting.
I20260502 14:06:22.375541 14195 tablet_service.cc:1511] Processing CreateTablet for tablet 4710965d239c4545af4272cddced8dfb (DEFAULT_TABLE table=test-workload [id=198a5c74696b4d95ba10704c64e5d4f5]), partition=RANGE (key) PARTITION UNBOUNDED
I20260502 14:06:22.375720 14680 log.cc:826] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Log is configured to *not* fsync() on all Append() calls
I20260502 14:06:22.375821 14195 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 4710965d239c4545af4272cddced8dfb. 1 dirs total, 0 dirs full, 0 dirs failed
I20260502 14:06:22.376047 14681 tablet_bootstrap.cc:654] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Neither blocks nor log segments found. Creating new log.
I20260502 14:06:22.376328 14681 log.cc:826] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Log is configured to *not* fsync() on all Append() calls
I20260502 14:06:22.376447 14680 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: No bootstrap required, opened a new log
I20260502 14:06:22.376518 14680 ts_tablet_manager.cc:1403] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Time spent bootstrapping tablet: real 0.002s user 0.001s sys 0.000s
I20260502 14:06:22.376959 14681 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: No bootstrap required, opened a new log
I20260502 14:06:22.377023 14681 ts_tablet_manager.cc:1403] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Time spent bootstrapping tablet: real 0.002s user 0.002s sys 0.000s
I20260502 14:06:22.377770 14680 raft_consensus.cc:359] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:22.377856 14680 raft_consensus.cc:385] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260502 14:06:22.377874 14680 raft_consensus.cc:740] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: fcb69d1fc3094c95bd74e18f784e388d, State: Initialized, Role: FOLLOWER
I20260502 14:06:22.377978 14680 consensus_queue.cc:260] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:22.378197 14680 ts_tablet_manager.cc:1434] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Time spent starting tablet: real 0.002s user 0.002s sys 0.000s
I20260502 14:06:22.378206 14684 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Bootstrap starting.
I20260502 14:06:22.378302 14525 heartbeater.cc:499] Master 127.12.158.126:36477 was elected leader, sending a full tablet report...
I20260502 14:06:22.378576 14681 raft_consensus.cc:359] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:22.378690 14681 raft_consensus.cc:385] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260502 14:06:22.378722 14681 raft_consensus.cc:740] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 8b5f3bbadc484c728f336c95a8d8fd78, State: Initialized, Role: FOLLOWER
I20260502 14:06:22.378782 14684 tablet_bootstrap.cc:654] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Neither blocks nor log segments found. Creating new log.
I20260502 14:06:22.378803 14681 consensus_queue.cc:260] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:22.379046 14681 ts_tablet_manager.cc:1434] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Time spent starting tablet: real 0.002s user 0.002s sys 0.000s
I20260502 14:06:22.379048 14684 log.cc:826] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Log is configured to *not* fsync() on all Append() calls
I20260502 14:06:22.379101 14393 heartbeater.cc:499] Master 127.12.158.126:36477 was elected leader, sending a full tablet report...
I20260502 14:06:22.379611 14684 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: No bootstrap required, opened a new log
I20260502 14:06:22.379686 14684 ts_tablet_manager.cc:1403] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Time spent bootstrapping tablet: real 0.002s user 0.001s sys 0.000s
I20260502 14:06:22.380997 14684 raft_consensus.cc:359] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:22.381090 14684 raft_consensus.cc:385] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260502 14:06:22.381109 14684 raft_consensus.cc:740] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: c9dd32be290b436fb7f776b6f481451b, State: Initialized, Role: FOLLOWER
I20260502 14:06:22.381225 14684 consensus_queue.cc:260] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:22.381429 14684 ts_tablet_manager.cc:1434] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Time spent starting tablet: real 0.002s user 0.000s sys 0.002s
I20260502 14:06:22.381491 14261 heartbeater.cc:499] Master 127.12.158.126:36477 was elected leader, sending a full tablet report...
I20260502 14:06:22.390551 14688 raft_consensus.cc:493] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260502 14:06:22.390650 14688 raft_consensus.cc:515] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:22.390938 14688 leader_election.cc:290] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 8b5f3bbadc484c728f336c95a8d8fd78 (127.12.158.66:42349), fcb69d1fc3094c95bd74e18f784e388d (127.12.158.67:44861)
I20260502 14:06:22.393741 14347 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "c9dd32be290b436fb7f776b6f481451b" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" is_pre_election: true
I20260502 14:06:22.393872 14347 raft_consensus.cc:2468] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate c9dd32be290b436fb7f776b6f481451b in term 0.
I20260502 14:06:22.394073 14149 leader_election.cc:304] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 8b5f3bbadc484c728f336c95a8d8fd78, c9dd32be290b436fb7f776b6f481451b; no voters:
I20260502 14:06:22.394126 14479 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "c9dd32be290b436fb7f776b6f481451b" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "fcb69d1fc3094c95bd74e18f784e388d" is_pre_election: true
I20260502 14:06:22.394253 14479 raft_consensus.cc:2468] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate c9dd32be290b436fb7f776b6f481451b in term 0.
I20260502 14:06:22.394279 14688 raft_consensus.cc:2804] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 0 FOLLOWER]: Leader pre-election won for term 1
I20260502 14:06:22.394433 14688 raft_consensus.cc:493] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260502 14:06:22.394480 14688 raft_consensus.cc:3060] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 0 FOLLOWER]: Advancing to term 1
I20260502 14:06:22.395008 14688 raft_consensus.cc:515] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:22.395128 14688 leader_election.cc:290] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [CANDIDATE]: Term 1 election: Requested vote from peers 8b5f3bbadc484c728f336c95a8d8fd78 (127.12.158.66:42349), fcb69d1fc3094c95bd74e18f784e388d (127.12.158.67:44861)
I20260502 14:06:22.395287 14347 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "c9dd32be290b436fb7f776b6f481451b" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "8b5f3bbadc484c728f336c95a8d8fd78"
I20260502 14:06:22.395315 14479 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "c9dd32be290b436fb7f776b6f481451b" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "fcb69d1fc3094c95bd74e18f784e388d"
I20260502 14:06:22.395362 14347 raft_consensus.cc:3060] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 0 FOLLOWER]: Advancing to term 1
I20260502 14:06:22.395388 14479 raft_consensus.cc:3060] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 0 FOLLOWER]: Advancing to term 1
I20260502 14:06:22.396008 14347 raft_consensus.cc:2468] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate c9dd32be290b436fb7f776b6f481451b in term 1.
I20260502 14:06:22.396025 14479 raft_consensus.cc:2468] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate c9dd32be290b436fb7f776b6f481451b in term 1.
I20260502 14:06:22.396185 14149 leader_election.cc:304] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 8b5f3bbadc484c728f336c95a8d8fd78, c9dd32be290b436fb7f776b6f481451b; no voters:
I20260502 14:06:22.396286 14688 raft_consensus.cc:2804] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 1 FOLLOWER]: Leader election won for term 1
I20260502 14:06:22.396404 14688 raft_consensus.cc:697] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 1 LEADER]: Becoming Leader. State: Replica: c9dd32be290b436fb7f776b6f481451b, State: Running, Role: LEADER
I20260502 14:06:22.396497 14688 consensus_queue.cc:237] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:22.397418 14072 catalog_manager.cc:5671] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b reported cstate change: term changed from 0 to 1, leader changed from <none> to c9dd32be290b436fb7f776b6f481451b (127.12.158.65). New cstate: current_term: 1 leader_uuid: "c9dd32be290b436fb7f776b6f481451b" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } health_report { overall_health: UNKNOWN } } }
I20260502 14:06:22.400291 12921 maintenance_mode-itest.cc:745] Restarting batch of 4 tservers: bd0a653794c34d9591e2d5c89c802493,8b5f3bbadc484c728f336c95a8d8fd78,c9dd32be290b436fb7f776b6f481451b,fcb69d1fc3094c95bd74e18f784e388d
I20260502 14:06:22.436770 14479 raft_consensus.cc:1275] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 1 FOLLOWER]: Refusing update from remote peer c9dd32be290b436fb7f776b6f481451b: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260502 14:06:22.437186 14688 consensus_queue.cc:1048] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [LEADER]: Connected to new peer: Peer: permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
W20260502 14:06:22.437997 14262 tablet.cc:2404] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Can't schedule compaction. Clean time has not been advanced past its initial value.
I20260502 14:06:22.439574 14347 raft_consensus.cc:1275] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 1 FOLLOWER]: Refusing update from remote peer c9dd32be290b436fb7f776b6f481451b: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260502 14:06:22.439883 14688 consensus_queue.cc:1048] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [LEADER]: Connected to new peer: Peer: permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
W20260502 14:06:22.441344 14526 tablet.cc:2404] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Can't schedule compaction. Clean time has not been advanced past its initial value.
I20260502 14:06:22.444227 14707 mvcc.cc:204] Tried to move back new op lower bound from 7281585284858380288 to 7281585284697395200. Current Snapshot: MvccSnapshot[applied={T|T < 7281585284858380288}]
I20260502 14:06:22.445299 14710 mvcc.cc:204] Tried to move back new op lower bound from 7281585284858380288 to 7281585284697395200. Current Snapshot: MvccSnapshot[applied={T|T < 7281585284858380288}]
I20260502 14:06:22.446904 14708 mvcc.cc:204] Tried to move back new op lower bound from 7281585284858380288 to 7281585284697395200. Current Snapshot: MvccSnapshot[applied={T|T < 7281585284858380288}]
I20260502 14:06:22.551492 14069 ts_manager.cc:295] Set tserver state for bd0a653794c34d9591e2d5c89c802493 to MAINTENANCE_MODE
I20260502 14:06:22.602618 14069 ts_manager.cc:295] Set tserver state for 8b5f3bbadc484c728f336c95a8d8fd78 to MAINTENANCE_MODE
I20260502 14:06:22.626120 14069 ts_manager.cc:295] Set tserver state for c9dd32be290b436fb7f776b6f481451b to MAINTENANCE_MODE
I20260502 14:06:22.643844 14069 ts_manager.cc:295] Set tserver state for fcb69d1fc3094c95bd74e18f784e388d to MAINTENANCE_MODE
I20260502 14:06:22.873919 14195 tablet_service.cc:1460] Tablet server c9dd32be290b436fb7f776b6f481451b set to quiescing
I20260502 14:06:22.873987 14195 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
I20260502 14:06:22.892701 14762 raft_consensus.cc:993] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: : Instructing follower 8b5f3bbadc484c728f336c95a8d8fd78 to start an election
I20260502 14:06:22.892779 14762 raft_consensus.cc:1081] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 1 LEADER]: Signalling peer 8b5f3bbadc484c728f336c95a8d8fd78 to start an election
I20260502 14:06:22.894002 14346 tablet_service.cc:2044] Received Run Leader Election RPC: tablet_id: "4710965d239c4545af4272cddced8dfb"
dest_uuid: "8b5f3bbadc484c728f336c95a8d8fd78"
from {username='slave'} at 127.12.158.65:49595
I20260502 14:06:22.894099 14346 raft_consensus.cc:493] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 1 FOLLOWER]: Starting forced leader election (received explicit request)
I20260502 14:06:22.894131 14346 raft_consensus.cc:3060] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 1 FOLLOWER]: Advancing to term 2
I20260502 14:06:22.894920 14346 raft_consensus.cc:515] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 2 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:22.895146 14346 leader_election.cc:290] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [CANDIDATE]: Term 2 election: Requested vote from peers c9dd32be290b436fb7f776b6f481451b (127.12.158.65:34909), fcb69d1fc3094c95bd74e18f784e388d (127.12.158.67:44861)
I20260502 14:06:22.896251 14346 raft_consensus.cc:1240] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 2 FOLLOWER]: Rejecting Update request from peer c9dd32be290b436fb7f776b6f481451b for earlier term 1. Current term is 2. Ops: [1.252-1.252]
I20260502 14:06:22.898411 14690 consensus_queue.cc:1059] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [LEADER]: Peer responded invalid term: Peer: permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 }, Status: INVALID_TERM, Last received: 1.251, Next index: 252, Last known committed idx: 249, Time since last communication: 0.000s
I20260502 14:06:22.898861 14761 raft_consensus.cc:3055] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 1 LEADER]: Stepping down as leader of term 1
I20260502 14:06:22.898942 14761 raft_consensus.cc:740] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 1 LEADER]: Becoming Follower/Learner. State: Replica: c9dd32be290b436fb7f776b6f481451b, State: Running, Role: LEADER
I20260502 14:06:22.899061 14761 consensus_queue.cc:260] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 251, Committed index: 251, Last appended: 1.254, Last appended by leader: 254, Current term: 1, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:22.899346 14761 raft_consensus.cc:3060] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 1 FOLLOWER]: Advancing to term 2
I20260502 14:06:22.903617 14215 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" candidate_term: 2 candidate_status { last_received { term: 1 index: 251 } } ignore_live_leader: true dest_uuid: "c9dd32be290b436fb7f776b6f481451b"
I20260502 14:06:22.903740 14215 raft_consensus.cc:2410] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 2 FOLLOWER]: Leader election vote request: Denying vote to candidate 8b5f3bbadc484c728f336c95a8d8fd78 for term 2 because replica has last-logged OpId of term: 1 index: 254, which is greater than that of the candidate, which has last-logged OpId of term: 1 index: 251.
I20260502 14:06:22.904724 14479 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" candidate_term: 2 candidate_status { last_received { term: 1 index: 251 } } ignore_live_leader: true dest_uuid: "fcb69d1fc3094c95bd74e18f784e388d"
I20260502 14:06:22.904871 14479 raft_consensus.cc:3060] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 1 FOLLOWER]: Advancing to term 2
I20260502 14:06:22.905624 14479 raft_consensus.cc:2410] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 2 FOLLOWER]: Leader election vote request: Denying vote to candidate 8b5f3bbadc484c728f336c95a8d8fd78 for term 2 because replica has last-logged OpId of term: 1 index: 253, which is greater than that of the candidate, which has last-logged OpId of term: 1 index: 251.
I20260502 14:06:22.905855 14281 leader_election.cc:304] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [CANDIDATE]: Term 2 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 8b5f3bbadc484c728f336c95a8d8fd78; no voters: c9dd32be290b436fb7f776b6f481451b, fcb69d1fc3094c95bd74e18f784e388d
I20260502 14:06:22.905995 14686 raft_consensus.cc:2749] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 2 FOLLOWER]: Leader election lost for term 2. Reason: could not achieve majority
I20260502 14:06:22.908828 14327 tablet_service.cc:1460] Tablet server 8b5f3bbadc484c728f336c95a8d8fd78 set to quiescing
I20260502 14:06:22.908895 14327 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260502 14:06:22.925554 14459 tablet_service.cc:1460] Tablet server fcb69d1fc3094c95bd74e18f784e388d set to quiescing
I20260502 14:06:22.925622 14459 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260502 14:06:22.940277 14591 tablet_service.cc:1460] Tablet server bd0a653794c34d9591e2d5c89c802493 set to quiescing
I20260502 14:06:22.940353 14591 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20260502 14:06:23.198124 14810 raft_consensus.cc:670] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: failed to trigger leader election: Illegal state: leader elections are disabled
W20260502 14:06:23.241770 14761 raft_consensus.cc:670] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: failed to trigger leader election: Illegal state: leader elections are disabled
W20260502 14:06:23.351549 14686 raft_consensus.cc:670] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: failed to trigger leader election: Illegal state: leader elections are disabled
I20260502 14:06:23.354682 14657 heartbeater.cc:499] Master 127.12.158.126:36477 was elected leader, sending a full tablet report...
I20260502 14:06:24.023468 14195 tablet_service.cc:1460] Tablet server c9dd32be290b436fb7f776b6f481451b set to quiescing
I20260502 14:06:24.023535 14195 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260502 14:06:24.079569 12921 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskW6FTeC/build/release/bin/kudu with pid 14528
I20260502 14:06:24.084218 12921 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskW6FTeC/build/release/bin/kudu
/tmp/dist-test-taskW6FTeC/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.12.158.68:37147
--local_ip_for_outbound_sockets=127.12.158.68
--tserver_master_addrs=127.12.158.126:36477
--webserver_port=44081
--webserver_interface=127.12.158.68
--builtin_ntp_servers=127.12.158.84:38351
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20260502 14:06:24.160575 14822 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260502 14:06:24.160727 14822 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260502 14:06:24.160749 14822 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260502 14:06:24.162151 14822 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260502 14:06:24.162195 14822 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.12.158.68
I20260502 14:06:24.163627 14822 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.12.158.84:38351
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.12.158.68:37147
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.12.158.68
--webserver_port=44081
--tserver_master_addrs=127.12.158.126:36477
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.14822
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.12.158.68
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type RELEASE
built by None at 02 May 2026 13:43:15 UTC on e7f111948823
build id 11709
I20260502 14:06:24.163848 14822 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260502 14:06:24.164032 14822 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260502 14:06:24.164614 14822 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260502 14:06:24.166466 14827 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:24.166440 14828 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260502 14:06:24.166638 14822 server_base.cc:1061] running on GCE node
W20260502 14:06:24.166448 14830 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260502 14:06:24.166878 14822 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260502 14:06:24.167035 14822 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260502 14:06:24.168155 14822 hybrid_clock.cc:648] HybridClock initialized: now 1777730784168136 us; error 34 us; skew 500 ppm
I20260502 14:06:24.169145 14822 webserver.cc:492] Webserver started at http://127.12.158.68:44081/ using document root <none> and password file <none>
I20260502 14:06:24.169338 14822 fs_manager.cc:362] Metadata directory not provided
I20260502 14:06:24.169392 14822 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260502 14:06:24.170506 14822 fs_manager.cc:714] Time spent opening directory manager: real 0.001s user 0.001s sys 0.000s
I20260502 14:06:24.171126 14836 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:24.171322 14822 fs_manager.cc:730] Time spent opening block manager: real 0.001s user 0.000s sys 0.000s
I20260502 14:06:24.171375 14822 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/data,/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/wal
uuid: "bd0a653794c34d9591e2d5c89c802493"
format_stamp: "Formatted at 2026-05-02 14:06:22 on dist-test-slave-23m0"
I20260502 14:06:24.171618 14822 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260502 14:06:24.179975 14822 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260502 14:06:24.180203 14822 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260502 14:06:24.180325 14822 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260502 14:06:24.180517 14822 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260502 14:06:24.180797 14822 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260502 14:06:24.180828 14822 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:24.180876 14822 ts_tablet_manager.cc:616] Registered 0 tablets
I20260502 14:06:24.180903 14822 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:24.186275 14822 rpc_server.cc:307] RPC server started. Bound to: 127.12.158.68:37147
I20260502 14:06:24.186322 14949 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.12.158.68:37147 every 8 connection(s)
I20260502 14:06:24.186587 14822 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/data/info.pb
I20260502 14:06:24.188350 12921 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskW6FTeC/build/release/bin/kudu as pid 14822
I20260502 14:06:24.188465 12921 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskW6FTeC/build/release/bin/kudu with pid 14264
I20260502 14:06:24.191643 14950 heartbeater.cc:344] Connected to a master server at 127.12.158.126:36477
I20260502 14:06:24.191777 14950 heartbeater.cc:461] Registering TS with master...
I20260502 14:06:24.191967 14950 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
I20260502 14:06:24.192335 14069 ts_manager.cc:194] Re-registered known tserver with Master: bd0a653794c34d9591e2d5c89c802493 (127.12.158.68:37147)
I20260502 14:06:24.192696 14069 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.12.158.68:47071
I20260502 14:06:24.196147 12921 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskW6FTeC/build/release/bin/kudu
/tmp/dist-test-taskW6FTeC/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.12.158.66:42349
--local_ip_for_outbound_sockets=127.12.158.66
--tserver_master_addrs=127.12.158.126:36477
--webserver_port=33233
--webserver_interface=127.12.158.66
--builtin_ntp_servers=127.12.158.84:38351
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20260502 14:06:24.270015 14954 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260502 14:06:24.270188 14954 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260502 14:06:24.270219 14954 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260502 14:06:24.271679 14954 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260502 14:06:24.271764 14954 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.12.158.66
I20260502 14:06:24.273293 14954 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.12.158.84:38351
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.12.158.66:42349
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.12.158.66
--webserver_port=33233
--tserver_master_addrs=127.12.158.126:36477
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.14954
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.12.158.66
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type RELEASE
built by None at 02 May 2026 13:43:15 UTC on e7f111948823
build id 11709
I20260502 14:06:24.273530 14954 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260502 14:06:24.273751 14954 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260502 14:06:24.274391 14954 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260502 14:06:24.276224 14959 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:24.276229 14962 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:24.276322 14960 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260502 14:06:24.276494 14954 server_base.cc:1061] running on GCE node
I20260502 14:06:24.276615 14954 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260502 14:06:24.276782 14954 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260502 14:06:24.277911 14954 hybrid_clock.cc:648] HybridClock initialized: now 1777730784277900 us; error 23 us; skew 500 ppm
I20260502 14:06:24.278956 14954 webserver.cc:492] Webserver started at http://127.12.158.66:33233/ using document root <none> and password file <none>
I20260502 14:06:24.279109 14954 fs_manager.cc:362] Metadata directory not provided
I20260502 14:06:24.279174 14954 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260502 14:06:24.280403 14954 fs_manager.cc:714] Time spent opening directory manager: real 0.001s user 0.000s sys 0.000s
I20260502 14:06:24.281006 14968 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:24.281199 14954 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.001s sys 0.000s
I20260502 14:06:24.281261 14954 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/data,/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/wal
uuid: "8b5f3bbadc484c728f336c95a8d8fd78"
format_stamp: "Formatted at 2026-05-02 14:06:22 on dist-test-slave-23m0"
I20260502 14:06:24.281483 14954 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260502 14:06:24.292788 14954 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260502 14:06:24.292989 14954 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260502 14:06:24.293077 14954 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260502 14:06:24.293212 14954 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260502 14:06:24.293587 14975 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20260502 14:06:24.294354 14954 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20260502 14:06:24.294394 14954 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s user 0.000s sys 0.000s
I20260502 14:06:24.294414 14954 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20260502 14:06:24.294930 14954 ts_tablet_manager.cc:616] Registered 1 tablets
I20260502 14:06:24.294961 14954 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s user 0.000s sys 0.000s
I20260502 14:06:24.295094 14975 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Bootstrap starting.
I20260502 14:06:24.301414 14954 rpc_server.cc:307] RPC server started. Bound to: 127.12.158.66:42349
I20260502 14:06:24.301455 15082 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.12.158.66:42349 every 8 connection(s)
I20260502 14:06:24.301712 14954 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/data/info.pb
I20260502 14:06:24.306559 15083 heartbeater.cc:344] Connected to a master server at 127.12.158.126:36477
I20260502 14:06:24.306651 15083 heartbeater.cc:461] Registering TS with master...
I20260502 14:06:24.306839 15083 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
I20260502 14:06:24.307397 14069 ts_manager.cc:194] Re-registered known tserver with Master: 8b5f3bbadc484c728f336c95a8d8fd78 (127.12.158.66:42349)
I20260502 14:06:24.307865 14069 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.12.158.66:40241
I20260502 14:06:24.310021 12921 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskW6FTeC/build/release/bin/kudu as pid 14954
I20260502 14:06:24.310113 12921 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskW6FTeC/build/release/bin/kudu with pid 14133
I20260502 14:06:24.311873 14975 log.cc:826] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Log is configured to *not* fsync() on all Append() calls
W20260502 14:06:24.317039 14666 meta_cache.cc:302] tablet 4710965d239c4545af4272cddced8dfb: replica c9dd32be290b436fb7f776b6f481451b (127.12.158.65:34909) has failed: Network error: recv got EOF from 127.12.158.65:34909 (error 108)
I20260502 14:06:24.317521 12921 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskW6FTeC/build/release/bin/kudu
/tmp/dist-test-taskW6FTeC/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.12.158.65:34909
--local_ip_for_outbound_sockets=127.12.158.65
--tserver_master_addrs=127.12.158.126:36477
--webserver_port=41631
--webserver_interface=127.12.158.65
--builtin_ntp_servers=127.12.158.84:38351
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20260502 14:06:24.326900 14438 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48342: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:24.347028 14438 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48342: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
I20260502 14:06:24.370428 14975 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Bootstrap replayed 1/1 log segments. Stats: ops{read=251 overwritten=0 applied=249 ignored=0} inserts{seen=12400 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20260502 14:06:24.370787 14975 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Bootstrap complete.
I20260502 14:06:24.371930 14975 ts_tablet_manager.cc:1403] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Time spent bootstrapping tablet: real 0.077s user 0.057s sys 0.019s
I20260502 14:06:24.373051 14975 raft_consensus.cc:359] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 2 FOLLOWER]: Replica starting. Triggering 2 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:24.374259 14975 raft_consensus.cc:740] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 2 FOLLOWER]: Becoming Follower/Learner. State: Replica: 8b5f3bbadc484c728f336c95a8d8fd78, State: Initialized, Role: FOLLOWER
I20260502 14:06:24.374402 14975 consensus_queue.cc:260] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 249, Last appended: 1.251, Last appended by leader: 251, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:24.374624 14975 ts_tablet_manager.cc:1434] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Time spent starting tablet: real 0.003s user 0.002s sys 0.001s
I20260502 14:06:24.374667 15083 heartbeater.cc:499] Master 127.12.158.126:36477 was elected leader, sending a full tablet report...
W20260502 14:06:24.378109 14438 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48342: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:24.399858 14995 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50016: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:24.411669 15087 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260502 14:06:24.411870 15087 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260502 14:06:24.411895 15087 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260502 14:06:24.413262 15087 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260502 14:06:24.413306 15087 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.12.158.65
I20260502 14:06:24.414695 15087 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.12.158.84:38351
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.12.158.65:34909
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.12.158.65
--webserver_port=41631
--tserver_master_addrs=127.12.158.126:36477
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.15087
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.12.158.65
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type RELEASE
built by None at 02 May 2026 13:43:15 UTC on e7f111948823
build id 11709
I20260502 14:06:24.414860 15087 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260502 14:06:24.415020 15087 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260502 14:06:24.415527 14438 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48342: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
E20260502 14:06:24.415714 15087 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260502 14:06:24.417613 15097 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:24.417586 15099 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260502 14:06:24.417745 15087 server_base.cc:1061] running on GCE node
W20260502 14:06:24.417567 15096 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260502 14:06:24.418020 15087 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260502 14:06:24.418246 15087 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260502 14:06:24.419394 15087 hybrid_clock.cc:648] HybridClock initialized: now 1777730784419371 us; error 32 us; skew 500 ppm
I20260502 14:06:24.420465 15087 webserver.cc:492] Webserver started at http://127.12.158.65:41631/ using document root <none> and password file <none>
I20260502 14:06:24.420670 15087 fs_manager.cc:362] Metadata directory not provided
I20260502 14:06:24.420751 15087 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260502 14:06:24.421891 15087 fs_manager.cc:714] Time spent opening directory manager: real 0.001s user 0.000s sys 0.001s
I20260502 14:06:24.422513 15105 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:24.422673 15087 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.000s sys 0.001s
I20260502 14:06:24.422748 15087 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/data,/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/wal
uuid: "c9dd32be290b436fb7f776b6f481451b"
format_stamp: "Formatted at 2026-05-02 14:06:21 on dist-test-slave-23m0"
I20260502 14:06:24.423066 15087 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260502 14:06:24.445708 15087 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260502 14:06:24.445999 15087 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260502 14:06:24.446130 15087 kserver.cc:163] Server-wide thread pool size limit: 3276
W20260502 14:06:24.446148 14995 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50016: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
I20260502 14:06:24.446332 15087 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260502 14:06:24.446761 15112 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20260502 14:06:24.447624 15087 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20260502 14:06:24.447683 15087 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s user 0.000s sys 0.000s
I20260502 14:06:24.447724 15087 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20260502 14:06:24.448259 15087 ts_tablet_manager.cc:616] Registered 1 tablets
I20260502 14:06:24.448313 15087 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s user 0.000s sys 0.000s
I20260502 14:06:24.448336 15112 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Bootstrap starting.
I20260502 14:06:24.454175 15087 rpc_server.cc:307] RPC server started. Bound to: 127.12.158.65:34909
I20260502 14:06:24.454226 15219 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.12.158.65:34909 every 8 connection(s)
I20260502 14:06:24.454489 15087 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/data/info.pb
I20260502 14:06:24.459311 15220 heartbeater.cc:344] Connected to a master server at 127.12.158.126:36477
I20260502 14:06:24.459403 15220 heartbeater.cc:461] Registering TS with master...
I20260502 14:06:24.459599 15220 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
I20260502 14:06:24.460215 14069 ts_manager.cc:194] Re-registered known tserver with Master: c9dd32be290b436fb7f776b6f481451b (127.12.158.65:34909)
I20260502 14:06:24.460593 14069 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.12.158.65:35591
I20260502 14:06:24.462457 12921 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskW6FTeC/build/release/bin/kudu as pid 15087
I20260502 14:06:24.462533 12921 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskW6FTeC/build/release/bin/kudu with pid 14397
I20260502 14:06:24.464262 15112 log.cc:826] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Log is configured to *not* fsync() on all Append() calls
W20260502 14:06:24.467703 14667 connection.cc:570] client connection to 127.12.158.67:44861 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
I20260502 14:06:24.468281 12921 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskW6FTeC/build/release/bin/kudu
/tmp/dist-test-taskW6FTeC/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.12.158.67:44861
--local_ip_for_outbound_sockets=127.12.158.67
--tserver_master_addrs=127.12.158.126:36477
--webserver_port=37231
--webserver_interface=127.12.158.67
--builtin_ntp_servers=127.12.158.84:38351
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20260502 14:06:24.484879 14995 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50016: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:24.489032 14995 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50016: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
I20260502 14:06:24.504446 15112 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Bootstrap replayed 1/1 log segments. Stats: ops{read=254 overwritten=0 applied=251 ignored=0} inserts{seen=12500 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20260502 14:06:24.504804 15112 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Bootstrap complete.
I20260502 14:06:24.505990 15112 ts_tablet_manager.cc:1403] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Time spent bootstrapping tablet: real 0.058s user 0.041s sys 0.014s
I20260502 14:06:24.506990 15112 raft_consensus.cc:359] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 2 FOLLOWER]: Replica starting. Triggering 3 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:24.507617 15112 raft_consensus.cc:740] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 2 FOLLOWER]: Becoming Follower/Learner. State: Replica: c9dd32be290b436fb7f776b6f481451b, State: Initialized, Role: FOLLOWER
I20260502 14:06:24.507786 15112 consensus_queue.cc:260] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 251, Last appended: 1.254, Last appended by leader: 254, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:24.508003 15112 ts_tablet_manager.cc:1434] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Time spent starting tablet: real 0.002s user 0.004s sys 0.000s
I20260502 14:06:24.508066 15220 heartbeater.cc:499] Master 127.12.158.126:36477 was elected leader, sending a full tablet report...
W20260502 14:06:24.509035 14995 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50016: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:24.525588 14995 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50016: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:24.548954 15224 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260502 14:06:24.549142 15224 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260502 14:06:24.549175 15224 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260502 14:06:24.550617 15224 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260502 14:06:24.550697 15224 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.12.158.67
I20260502 14:06:24.552278 15224 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.12.158.84:38351
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.12.158.67:44861
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.12.158.67
--webserver_port=37231
--tserver_master_addrs=127.12.158.126:36477
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.15224
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.12.158.67
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type RELEASE
built by None at 02 May 2026 13:43:15 UTC on e7f111948823
build id 11709
I20260502 14:06:24.552515 15224 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260502 14:06:24.552731 15224 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260502 14:06:24.553376 15224 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260502 14:06:24.555197 15232 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:24.555171 15233 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:24.555235 15235 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260502 14:06:24.555316 15224 server_base.cc:1061] running on GCE node
I20260502 14:06:24.555532 15224 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260502 14:06:24.555693 15224 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260502 14:06:24.556828 15224 hybrid_clock.cc:648] HybridClock initialized: now 1777730784556814 us; error 25 us; skew 500 ppm
I20260502 14:06:24.557827 15224 webserver.cc:492] Webserver started at http://127.12.158.67:37231/ using document root <none> and password file <none>
I20260502 14:06:24.558023 15224 fs_manager.cc:362] Metadata directory not provided
I20260502 14:06:24.558080 15224 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260502 14:06:24.559234 15224 fs_manager.cc:714] Time spent opening directory manager: real 0.001s user 0.001s sys 0.000s
I20260502 14:06:24.559885 15241 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:24.560065 15224 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:24.560143 15224 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/data,/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/wal
uuid: "fcb69d1fc3094c95bd74e18f784e388d"
format_stamp: "Formatted at 2026-05-02 14:06:22 on dist-test-slave-23m0"
I20260502 14:06:24.560392 15224 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260502 14:06:24.587718 15224 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260502 14:06:24.588032 15224 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260502 14:06:24.588173 15224 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260502 14:06:24.588387 15224 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260502 14:06:24.588799 15248 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20260502 14:06:24.589534 15224 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20260502 14:06:24.589591 15224 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s user 0.000s sys 0.000s
I20260502 14:06:24.589637 15224 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20260502 14:06:24.590130 15224 ts_tablet_manager.cc:616] Registered 1 tablets
I20260502 14:06:24.590183 15224 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s user 0.000s sys 0.000s
I20260502 14:06:24.590205 15248 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Bootstrap starting.
I20260502 14:06:24.596870 15224 rpc_server.cc:307] RPC server started. Bound to: 127.12.158.67:44861
I20260502 14:06:24.596921 15355 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.12.158.67:44861 every 8 connection(s)
I20260502 14:06:24.597201 15224 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/data/info.pb
I20260502 14:06:24.601987 15356 heartbeater.cc:344] Connected to a master server at 127.12.158.126:36477
I20260502 14:06:24.602092 15356 heartbeater.cc:461] Registering TS with master...
I20260502 14:06:24.602301 15356 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
I20260502 14:06:24.602573 12921 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskW6FTeC/build/release/bin/kudu as pid 15224
I20260502 14:06:24.602891 14068 ts_manager.cc:194] Re-registered known tserver with Master: fcb69d1fc3094c95bd74e18f784e388d (127.12.158.67:44861)
I20260502 14:06:24.603401 14068 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.12.158.67:39909
I20260502 14:06:24.605764 15248 log.cc:826] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Log is configured to *not* fsync() on all Append() calls
I20260502 14:06:24.640048 15248 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Bootstrap replayed 1/1 log segments. Stats: ops{read=253 overwritten=0 applied=251 ignored=0} inserts{seen=12500 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20260502 14:06:24.640376 15248 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Bootstrap complete.
I20260502 14:06:24.641287 15248 ts_tablet_manager.cc:1403] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Time spent bootstrapping tablet: real 0.051s user 0.041s sys 0.008s
I20260502 14:06:24.642122 15248 raft_consensus.cc:359] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 2 FOLLOWER]: Replica starting. Triggering 2 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:24.642714 15248 raft_consensus.cc:740] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 2 FOLLOWER]: Becoming Follower/Learner. State: Replica: fcb69d1fc3094c95bd74e18f784e388d, State: Initialized, Role: FOLLOWER
I20260502 14:06:24.642884 15248 consensus_queue.cc:260] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 251, Last appended: 1.253, Last appended by leader: 253, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:24.643124 15248 ts_tablet_manager.cc:1434] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Time spent starting tablet: real 0.002s user 0.003s sys 0.000s
I20260502 14:06:24.643194 15356 heartbeater.cc:499] Master 127.12.158.126:36477 was elected leader, sending a full tablet report...
I20260502 14:06:24.649246 15091 raft_consensus.cc:493] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 2 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260502 14:06:24.649335 15091 raft_consensus.cc:515] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 2 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:24.649706 15091 leader_election.cc:290] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [CANDIDATE]: Term 3 pre-election: Requested pre-vote from peers c9dd32be290b436fb7f776b6f481451b (127.12.158.65:34909), fcb69d1fc3094c95bd74e18f784e388d (127.12.158.67:44861)
I20260502 14:06:24.652647 15174 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" candidate_term: 3 candidate_status { last_received { term: 1 index: 251 } } ignore_live_leader: false dest_uuid: "c9dd32be290b436fb7f776b6f481451b" is_pre_election: true
I20260502 14:06:24.652796 15174 raft_consensus.cc:2410] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 2 FOLLOWER]: Leader pre-election vote request: Denying vote to candidate 8b5f3bbadc484c728f336c95a8d8fd78 for term 3 because replica has last-logged OpId of term: 1 index: 254, which is greater than that of the candidate, which has last-logged OpId of term: 1 index: 251.
I20260502 14:06:24.653982 15310 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" candidate_term: 3 candidate_status { last_received { term: 1 index: 251 } } ignore_live_leader: false dest_uuid: "fcb69d1fc3094c95bd74e18f784e388d" is_pre_election: true
I20260502 14:06:24.654109 15310 raft_consensus.cc:2410] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 2 FOLLOWER]: Leader pre-election vote request: Denying vote to candidate 8b5f3bbadc484c728f336c95a8d8fd78 for term 3 because replica has last-logged OpId of term: 1 index: 253, which is greater than that of the candidate, which has last-logged OpId of term: 1 index: 251.
I20260502 14:06:24.654369 14970 leader_election.cc:304] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [CANDIDATE]: Term 3 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 8b5f3bbadc484c728f336c95a8d8fd78; no voters: c9dd32be290b436fb7f776b6f481451b, fcb69d1fc3094c95bd74e18f784e388d
I20260502 14:06:24.654505 15091 raft_consensus.cc:2749] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 2 FOLLOWER]: Leader pre-election lost for term 3. Reason: could not achieve majority
I20260502 14:06:24.716441 15154 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260502 14:06:24.716506 15290 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260502 14:06:24.716464 14884 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260502 14:06:24.716714 15017 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260502 14:06:24.771237 15227 raft_consensus.cc:493] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 2 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260502 14:06:24.771348 15227 raft_consensus.cc:515] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 2 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:24.771584 15227 leader_election.cc:290] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [CANDIDATE]: Term 3 pre-election: Requested pre-vote from peers 8b5f3bbadc484c728f336c95a8d8fd78 (127.12.158.66:42349), fcb69d1fc3094c95bd74e18f784e388d (127.12.158.67:44861)
I20260502 14:06:24.774493 15310 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "c9dd32be290b436fb7f776b6f481451b" candidate_term: 3 candidate_status { last_received { term: 1 index: 254 } } ignore_live_leader: false dest_uuid: "fcb69d1fc3094c95bd74e18f784e388d" is_pre_election: true
I20260502 14:06:24.774477 15037 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "c9dd32be290b436fb7f776b6f481451b" candidate_term: 3 candidate_status { last_received { term: 1 index: 254 } } ignore_live_leader: false dest_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" is_pre_election: true
I20260502 14:06:24.774601 15037 raft_consensus.cc:2468] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 2 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate c9dd32be290b436fb7f776b6f481451b in term 2.
I20260502 14:06:24.774631 15310 raft_consensus.cc:2468] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 2 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate c9dd32be290b436fb7f776b6f481451b in term 2.
I20260502 14:06:24.774803 15107 leader_election.cc:304] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [CANDIDATE]: Term 3 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: c9dd32be290b436fb7f776b6f481451b, fcb69d1fc3094c95bd74e18f784e388d; no voters:
I20260502 14:06:24.774948 15227 raft_consensus.cc:2804] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 2 FOLLOWER]: Leader pre-election won for term 3
I20260502 14:06:24.775025 15227 raft_consensus.cc:493] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 2 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260502 14:06:24.775043 15227 raft_consensus.cc:3060] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 2 FOLLOWER]: Advancing to term 3
I20260502 14:06:24.776010 15227 raft_consensus.cc:515] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 3 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:24.776108 15227 leader_election.cc:290] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [CANDIDATE]: Term 3 election: Requested vote from peers 8b5f3bbadc484c728f336c95a8d8fd78 (127.12.158.66:42349), fcb69d1fc3094c95bd74e18f784e388d (127.12.158.67:44861)
I20260502 14:06:24.776404 15310 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "c9dd32be290b436fb7f776b6f481451b" candidate_term: 3 candidate_status { last_received { term: 1 index: 254 } } ignore_live_leader: false dest_uuid: "fcb69d1fc3094c95bd74e18f784e388d"
I20260502 14:06:24.776404 15037 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "c9dd32be290b436fb7f776b6f481451b" candidate_term: 3 candidate_status { last_received { term: 1 index: 254 } } ignore_live_leader: false dest_uuid: "8b5f3bbadc484c728f336c95a8d8fd78"
I20260502 14:06:24.776489 15037 raft_consensus.cc:3060] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 2 FOLLOWER]: Advancing to term 3
I20260502 14:06:24.776489 15310 raft_consensus.cc:3060] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 2 FOLLOWER]: Advancing to term 3
I20260502 14:06:24.777571 15037 raft_consensus.cc:2468] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 3 FOLLOWER]: Leader election vote request: Granting yes vote for candidate c9dd32be290b436fb7f776b6f481451b in term 3.
I20260502 14:06:24.777571 15310 raft_consensus.cc:2468] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 3 FOLLOWER]: Leader election vote request: Granting yes vote for candidate c9dd32be290b436fb7f776b6f481451b in term 3.
I20260502 14:06:24.777767 15107 leader_election.cc:304] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [CANDIDATE]: Term 3 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: c9dd32be290b436fb7f776b6f481451b, fcb69d1fc3094c95bd74e18f784e388d; no voters:
I20260502 14:06:24.777911 15227 raft_consensus.cc:2804] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 3 FOLLOWER]: Leader election won for term 3
I20260502 14:06:24.778090 15227 raft_consensus.cc:697] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 3 LEADER]: Becoming Leader. State: Replica: c9dd32be290b436fb7f776b6f481451b, State: Running, Role: LEADER
I20260502 14:06:24.778258 15227 consensus_queue.cc:237] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 251, Committed index: 251, Last appended: 1.254, Last appended by leader: 254, Current term: 3, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:24.778944 14069 catalog_manager.cc:5671] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b reported cstate change: term changed from 1 to 3. New cstate: current_term: 3 leader_uuid: "c9dd32be290b436fb7f776b6f481451b" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } health_report { overall_health: UNKNOWN } } }
I20260502 14:06:24.870265 15037 raft_consensus.cc:1275] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 3 FOLLOWER]: Refusing update from remote peer c9dd32be290b436fb7f776b6f481451b: Log matching property violated. Preceding OpId in replica: term: 1 index: 251. Preceding OpId from leader: term: 3 index: 255. (index mismatch)
I20260502 14:06:24.870616 15227 consensus_queue.cc:1048] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [LEADER]: Connected to new peer: Peer: permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 255, Last known committed idx: 249, Time since last communication: 0.000s
I20260502 14:06:24.873201 15310 raft_consensus.cc:1275] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 3 FOLLOWER]: Refusing update from remote peer c9dd32be290b436fb7f776b6f481451b: Log matching property violated. Preceding OpId in replica: term: 1 index: 253. Preceding OpId from leader: term: 3 index: 255. (index mismatch)
I20260502 14:06:24.873556 15227 consensus_queue.cc:1048] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [LEADER]: Connected to new peer: Peer: permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 255, Last known committed idx: 251, Time since last communication: 0.000s
I20260502 14:06:25.193370 14950 heartbeater.cc:499] Master 127.12.158.126:36477 was elected leader, sending a full tablet report...
W20260502 14:06:25.334503 14697 scanner-internal.cc:458] Time spent opening tablet: real 2.408s user 0.001s sys 0.000s
W20260502 14:06:25.334746 14695 scanner-internal.cc:458] Time spent opening tablet: real 2.408s user 0.000s sys 0.001s
W20260502 14:06:25.334949 14696 scanner-internal.cc:458] Time spent opening tablet: real 2.409s user 0.001s sys 0.000s
I20260502 14:06:29.997496 14884 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260502 14:06:30.001431 15290 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260502 14:06:30.008899 15154 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
I20260502 14:06:30.015898 15017 tablet_service.cc:1467] Tablet server has 0 leaders and 3 scanners
I20260502 14:06:30.362352 14069 ts_manager.cc:284] Unset tserver state for fcb69d1fc3094c95bd74e18f784e388d from MAINTENANCE_MODE
I20260502 14:06:30.381107 14069 ts_manager.cc:284] Unset tserver state for c9dd32be290b436fb7f776b6f481451b from MAINTENANCE_MODE
I20260502 14:06:30.496379 14069 ts_manager.cc:284] Unset tserver state for 8b5f3bbadc484c728f336c95a8d8fd78 from MAINTENANCE_MODE
I20260502 14:06:30.501825 14069 ts_manager.cc:284] Unset tserver state for bd0a653794c34d9591e2d5c89c802493 from MAINTENANCE_MODE
I20260502 14:06:30.886703 15356 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
I20260502 14:06:30.886715 15220 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
I20260502 14:06:30.890621 15083 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
I20260502 14:06:30.912791 14068 ts_manager.cc:295] Set tserver state for fcb69d1fc3094c95bd74e18f784e388d to MAINTENANCE_MODE
I20260502 14:06:30.928676 14068 ts_manager.cc:295] Set tserver state for bd0a653794c34d9591e2d5c89c802493 to MAINTENANCE_MODE
I20260502 14:06:30.954828 14068 ts_manager.cc:295] Set tserver state for c9dd32be290b436fb7f776b6f481451b to MAINTENANCE_MODE
I20260502 14:06:30.956146 14068 ts_manager.cc:295] Set tserver state for 8b5f3bbadc484c728f336c95a8d8fd78 to MAINTENANCE_MODE
I20260502 14:06:31.197732 14950 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
I20260502 14:06:31.240057 15154 tablet_service.cc:1460] Tablet server c9dd32be290b436fb7f776b6f481451b set to quiescing
I20260502 14:06:31.240120 15154 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
I20260502 14:06:31.253357 15476 raft_consensus.cc:993] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: : Instructing follower fcb69d1fc3094c95bd74e18f784e388d to start an election
I20260502 14:06:31.253434 15476 raft_consensus.cc:1081] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 3 LEADER]: Signalling peer fcb69d1fc3094c95bd74e18f784e388d to start an election
I20260502 14:06:31.255466 15309 tablet_service.cc:2044] Received Run Leader Election RPC: tablet_id: "4710965d239c4545af4272cddced8dfb"
dest_uuid: "fcb69d1fc3094c95bd74e18f784e388d"
from {username='slave'} at 127.12.158.65:54307
I20260502 14:06:31.255580 15309 raft_consensus.cc:493] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 3 FOLLOWER]: Starting forced leader election (received explicit request)
I20260502 14:06:31.255611 15309 raft_consensus.cc:3060] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 3 FOLLOWER]: Advancing to term 4
I20260502 14:06:31.256699 15309 raft_consensus.cc:515] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 4 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:31.257428 15309 leader_election.cc:290] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [CANDIDATE]: Term 4 election: Requested vote from peers 8b5f3bbadc484c728f336c95a8d8fd78 (127.12.158.66:42349), c9dd32be290b436fb7f776b6f481451b (127.12.158.65:34909)
I20260502 14:06:31.258033 15309 raft_consensus.cc:1240] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 4 FOLLOWER]: Rejecting Update request from peer c9dd32be290b436fb7f776b6f481451b for earlier term 3. Current term is 4. Ops: []
I20260502 14:06:31.258507 15475 consensus_queue.cc:1059] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [LEADER]: Peer responded invalid term: Peer: permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 }, Status: INVALID_TERM, Last received: 3.6107, Next index: 6108, Last known committed idx: 6105, Time since last communication: 0.000s
I20260502 14:06:31.258780 15421 raft_consensus.cc:3055] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 3 LEADER]: Stepping down as leader of term 3
I20260502 14:06:31.258832 15421 raft_consensus.cc:740] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 3 LEADER]: Becoming Follower/Learner. State: Replica: c9dd32be290b436fb7f776b6f481451b, State: Running, Role: LEADER
I20260502 14:06:31.258891 15421 consensus_queue.cc:260] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 6107, Committed index: 6107, Last appended: 3.6108, Last appended by leader: 6108, Current term: 3, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:31.259464 15421 raft_consensus.cc:3060] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 3 FOLLOWER]: Advancing to term 4
W20260502 14:06:31.260756 15132 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36052: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:31.261205 15132 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36052: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:31.264729 14995 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50016: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
I20260502 14:06:31.265825 14884 tablet_service.cc:1460] Tablet server bd0a653794c34d9591e2d5c89c802493 set to quiescing
I20260502 14:06:31.265897 14884 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20260502 14:06:31.266964 14994 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50016: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
I20260502 14:06:31.268366 15174 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "fcb69d1fc3094c95bd74e18f784e388d" candidate_term: 4 candidate_status { last_received { term: 3 index: 6107 } } ignore_live_leader: true dest_uuid: "c9dd32be290b436fb7f776b6f481451b"
I20260502 14:06:31.268469 15174 raft_consensus.cc:2410] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 4 FOLLOWER]: Leader election vote request: Denying vote to candidate fcb69d1fc3094c95bd74e18f784e388d for term 4 because replica has last-logged OpId of term: 3 index: 6108, which is greater than that of the candidate, which has last-logged OpId of term: 3 index: 6107.
I20260502 14:06:31.268639 15037 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "fcb69d1fc3094c95bd74e18f784e388d" candidate_term: 4 candidate_status { last_received { term: 3 index: 6107 } } ignore_live_leader: true dest_uuid: "8b5f3bbadc484c728f336c95a8d8fd78"
I20260502 14:06:31.268711 15037 raft_consensus.cc:3060] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 3 FOLLOWER]: Advancing to term 4
I20260502 14:06:31.269301 15037 raft_consensus.cc:2410] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 4 FOLLOWER]: Leader election vote request: Denying vote to candidate fcb69d1fc3094c95bd74e18f784e388d for term 4 because replica has last-logged OpId of term: 3 index: 6108, which is greater than that of the candidate, which has last-logged OpId of term: 3 index: 6107.
I20260502 14:06:31.269523 15243 leader_election.cc:304] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [CANDIDATE]: Term 4 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: fcb69d1fc3094c95bd74e18f784e388d; no voters: 8b5f3bbadc484c728f336c95a8d8fd78, c9dd32be290b436fb7f776b6f481451b
I20260502 14:06:31.275259 15587 raft_consensus.cc:2749] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 4 FOLLOWER]: Leader election lost for term 4. Reason: could not achieve majority
W20260502 14:06:31.281334 15270 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51864: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:31.281339 15269 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51864: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:31.288426 15132 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36052: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:31.289814 15132 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36052: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:31.295498 14996 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50016: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:31.296504 14994 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50016: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
I20260502 14:06:31.300046 15290 tablet_service.cc:1460] Tablet server fcb69d1fc3094c95bd74e18f784e388d set to quiescing
I20260502 14:06:31.300100 15290 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20260502 14:06:31.304198 15269 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51864: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:31.304371 15270 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51864: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
I20260502 14:06:31.307828 15017 tablet_service.cc:1460] Tablet server 8b5f3bbadc484c728f336c95a8d8fd78 set to quiescing
I20260502 14:06:31.307927 15017 tablet_service.cc:1467] Tablet server has 0 leaders and 2 scanners
W20260502 14:06:31.311316 15132 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36052: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:31.314044 15132 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36052: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:31.322602 14997 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50016: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:31.322623 14994 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50016: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:31.334125 15270 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51864: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:31.335012 15270 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51864: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:31.347038 15132 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36052: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:31.347911 15132 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36052: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:31.360392 14997 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50016: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:31.362432 14997 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50016: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:31.373955 15270 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51864: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:31.376767 15270 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51864: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:31.390673 15132 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36052: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:31.391010 15132 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36052: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:31.404526 14997 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50016: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:31.407612 14994 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50016: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:31.422937 15270 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51864: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:31.426085 15270 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51864: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:31.439913 15132 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36052: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:31.444116 15132 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36052: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:31.456584 14997 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50016: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:31.462777 14997 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50016: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:31.476145 15270 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51864: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:31.482470 15270 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51864: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:31.497212 15132 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36052: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:31.501519 15132 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36052: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:31.521124 14994 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50016: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:31.522181 14994 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50016: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:31.545876 15270 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51864: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:31.546697 15270 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51864: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:31.556499 15421 raft_consensus.cc:670] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: failed to trigger leader election: Illegal state: leader elections are disabled
W20260502 14:06:31.558774 15608 raft_consensus.cc:670] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: failed to trigger leader election: Illegal state: leader elections are disabled
W20260502 14:06:31.568981 15132 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36052: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:31.571805 15132 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36052: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:31.595425 14994 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50016: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:31.595425 14997 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50016: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:31.620146 15270 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51864: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:31.621949 15270 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51864: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:31.634473 15587 raft_consensus.cc:670] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: failed to trigger leader election: Illegal state: leader elections are disabled
W20260502 14:06:31.646034 15132 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36052: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:31.650951 15132 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36052: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:31.675657 14994 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50016: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:31.676628 14997 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50016: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:31.703332 15270 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51864: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:31.704324 15270 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51864: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:31.731402 15132 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36052: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:31.735381 15132 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36052: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:31.762989 14997 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50016: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:31.765075 14994 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50016: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:31.793570 15270 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51864: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:31.794486 15270 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51864: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:31.824668 15132 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36052: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:31.829334 15132 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36052: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:31.858847 14997 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50016: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:31.863063 14997 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50016: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:31.893278 15270 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51864: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:31.900055 15270 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51864: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:31.930274 15132 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36052: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:31.935210 15132 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36052: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:31.968998 14994 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50016: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:31.973070 14994 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50016: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:32.008875 15270 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51864: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:32.010416 15270 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51864: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:32.048389 15132 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36052: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:32.050228 15132 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36052: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:32.088994 14994 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50016: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:32.090994 14994 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50016: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:32.129748 15270 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51864: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:32.132323 15270 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51864: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:32.171742 15132 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36052: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:32.172662 15132 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36052: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:32.215505 14994 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50016: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:32.217442 14994 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50016: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:32.260087 15270 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51864: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:32.260087 15269 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51864: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:32.304625 15132 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36052: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:32.306396 15132 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36052: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:32.352236 14994 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50016: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:32.353276 14994 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50016: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
I20260502 14:06:32.382267 15154 tablet_service.cc:1460] Tablet server c9dd32be290b436fb7f776b6f481451b set to quiescing
I20260502 14:06:32.382320 15154 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260502 14:06:32.398303 15017 tablet_service.cc:1460] Tablet server 8b5f3bbadc484c728f336c95a8d8fd78 set to quiescing
I20260502 14:06:32.398371 15017 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20260502 14:06:32.400038 15269 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51864: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:32.401505 15269 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51864: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:32.448642 15132 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36052: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:32.448642 15133 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36052: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
I20260502 14:06:32.454056 12921 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskW6FTeC/build/release/bin/kudu with pid 14822
I20260502 14:06:32.459856 12921 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskW6FTeC/build/release/bin/kudu
/tmp/dist-test-taskW6FTeC/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.12.158.68:37147
--local_ip_for_outbound_sockets=127.12.158.68
--tserver_master_addrs=127.12.158.126:36477
--webserver_port=44081
--webserver_interface=127.12.158.68
--builtin_ntp_servers=127.12.158.84:38351
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20260502 14:06:32.498659 14994 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50016: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:32.499644 14994 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:50016: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:32.536907 15631 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260502 14:06:32.537047 15631 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260502 14:06:32.537067 15631 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260502 14:06:32.538450 15631 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260502 14:06:32.538492 15631 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.12.158.68
I20260502 14:06:32.540161 15631 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.12.158.84:38351
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.12.158.68:37147
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.12.158.68
--webserver_port=44081
--tserver_master_addrs=127.12.158.126:36477
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.15631
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.12.158.68
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type RELEASE
built by None at 02 May 2026 13:43:15 UTC on e7f111948823
build id 11709
I20260502 14:06:32.540359 15631 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260502 14:06:32.540580 15631 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260502 14:06:32.541213 15631 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260502 14:06:32.543200 15636 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:32.543267 15637 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260502 14:06:32.543249 15631 server_base.cc:1061] running on GCE node
W20260502 14:06:32.543201 15639 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260502 14:06:32.543530 15631 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260502 14:06:32.543748 15631 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260502 14:06:32.544919 15631 hybrid_clock.cc:648] HybridClock initialized: now 1777730792544903 us; error 53 us; skew 500 ppm
I20260502 14:06:32.545874 15631 webserver.cc:492] Webserver started at http://127.12.158.68:44081/ using document root <none> and password file <none>
I20260502 14:06:32.546058 15631 fs_manager.cc:362] Metadata directory not provided
I20260502 14:06:32.546119 15631 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260502 14:06:32.547298 15631 fs_manager.cc:714] Time spent opening directory manager: real 0.001s user 0.001s sys 0.000s
W20260502 14:06:32.547396 15269 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51864: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
I20260502 14:06:32.547957 15645 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:32.548120 15631 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.000s sys 0.001s
I20260502 14:06:32.548194 15631 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/data,/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/wal
uuid: "bd0a653794c34d9591e2d5c89c802493"
format_stamp: "Formatted at 2026-05-02 14:06:22 on dist-test-slave-23m0"
I20260502 14:06:32.548393 15631 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
W20260502 14:06:32.552006 15269 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51864: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
I20260502 14:06:32.568027 15631 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260502 14:06:32.568311 15631 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260502 14:06:32.568435 15631 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260502 14:06:32.568619 15631 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260502 14:06:32.568961 15631 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260502 14:06:32.569015 15631 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:32.569052 15631 ts_tablet_manager.cc:616] Registered 0 tablets
I20260502 14:06:32.569077 15631 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:32.574326 15631 rpc_server.cc:307] RPC server started. Bound to: 127.12.158.68:37147
I20260502 14:06:32.574436 15758 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.12.158.68:37147 every 8 connection(s)
I20260502 14:06:32.574684 15631 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/data/info.pb
I20260502 14:06:32.578791 15759 heartbeater.cc:344] Connected to a master server at 127.12.158.126:36477
I20260502 14:06:32.578864 15759 heartbeater.cc:461] Registering TS with master...
I20260502 14:06:32.578992 15759 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
I20260502 14:06:32.579308 14068 ts_manager.cc:194] Re-registered known tserver with Master: bd0a653794c34d9591e2d5c89c802493 (127.12.158.68:37147)
I20260502 14:06:32.579614 14068 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.12.158.68:50975
I20260502 14:06:32.584512 12921 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskW6FTeC/build/release/bin/kudu as pid 15631
I20260502 14:06:32.584647 12921 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskW6FTeC/build/release/bin/kudu with pid 14954
I20260502 14:06:32.592626 12921 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskW6FTeC/build/release/bin/kudu
/tmp/dist-test-taskW6FTeC/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.12.158.66:42349
--local_ip_for_outbound_sockets=127.12.158.66
--tserver_master_addrs=127.12.158.126:36477
--webserver_port=33233
--webserver_interface=127.12.158.66
--builtin_ntp_servers=127.12.158.84:38351
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20260502 14:06:32.598335 15133 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36052: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:32.603348 15133 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36052: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
I20260502 14:06:32.631276 14697 meta_cache.cc:1510] marking tablet server 8b5f3bbadc484c728f336c95a8d8fd78 (127.12.158.66:42349) as failed
W20260502 14:06:32.631369 14697 meta_cache.cc:302] tablet 4710965d239c4545af4272cddced8dfb: replica 8b5f3bbadc484c728f336c95a8d8fd78 (127.12.158.66:42349) has failed: Network error: TS failed: Client connection negotiation failed: client connection to 127.12.158.66:42349: connect: Connection refused (error 111)
W20260502 14:06:32.650610 15269 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51864: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:32.655655 15269 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51864: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:32.668457 15763 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260502 14:06:32.668615 15763 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260502 14:06:32.668649 15763 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260502 14:06:32.670109 15763 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260502 14:06:32.670184 15763 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.12.158.66
I20260502 14:06:32.671653 15763 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.12.158.84:38351
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.12.158.66:42349
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.12.158.66
--webserver_port=33233
--tserver_master_addrs=127.12.158.126:36477
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.15763
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.12.158.66
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type RELEASE
built by None at 02 May 2026 13:43:15 UTC on e7f111948823
build id 11709
I20260502 14:06:32.671895 15763 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260502 14:06:32.672106 15763 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260502 14:06:32.672705 15763 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260502 14:06:32.674398 15769 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:32.674401 15772 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:32.674505 15770 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260502 14:06:32.674685 15763 server_base.cc:1061] running on GCE node
I20260502 14:06:32.674835 15763 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260502 14:06:32.675042 15763 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260502 14:06:32.676185 15763 hybrid_clock.cc:648] HybridClock initialized: now 1777730792676147 us; error 50 us; skew 500 ppm
I20260502 14:06:32.677147 15763 webserver.cc:492] Webserver started at http://127.12.158.66:33233/ using document root <none> and password file <none>
I20260502 14:06:32.677338 15763 fs_manager.cc:362] Metadata directory not provided
I20260502 14:06:32.677397 15763 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260502 14:06:32.678508 15763 fs_manager.cc:714] Time spent opening directory manager: real 0.001s user 0.001s sys 0.000s
I20260502 14:06:32.679158 15778 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:32.679316 15763 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.001s sys 0.000s
I20260502 14:06:32.679387 15763 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/data,/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/wal
uuid: "8b5f3bbadc484c728f336c95a8d8fd78"
format_stamp: "Formatted at 2026-05-02 14:06:22 on dist-test-slave-23m0"
I20260502 14:06:32.679616 15763 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
W20260502 14:06:32.702888 15133 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36052: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
I20260502 14:06:32.703513 15763 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260502 14:06:32.703717 15763 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260502 14:06:32.703864 15763 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260502 14:06:32.704049 15763 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260502 14:06:32.704430 15785 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20260502 14:06:32.705094 15763 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20260502 14:06:32.705149 15763 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s user 0.000s sys 0.000s
I20260502 14:06:32.705188 15763 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20260502 14:06:32.705687 15763 ts_tablet_manager.cc:616] Registered 1 tablets
I20260502 14:06:32.705737 15763 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s user 0.000s sys 0.000s
I20260502 14:06:32.705761 15785 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Bootstrap starting.
I20260502 14:06:32.711800 15763 rpc_server.cc:307] RPC server started. Bound to: 127.12.158.66:42349
I20260502 14:06:32.711861 15892 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.12.158.66:42349 every 8 connection(s)
I20260502 14:06:32.712126 15763 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/data/info.pb
I20260502 14:06:32.716692 15893 heartbeater.cc:344] Connected to a master server at 127.12.158.126:36477
I20260502 14:06:32.716799 15893 heartbeater.cc:461] Registering TS with master...
I20260502 14:06:32.716856 12921 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskW6FTeC/build/release/bin/kudu as pid 15763
I20260502 14:06:32.716948 12921 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskW6FTeC/build/release/bin/kudu with pid 15087
I20260502 14:06:32.716969 15893 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
I20260502 14:06:32.720960 14068 ts_manager.cc:194] Re-registered known tserver with Master: 8b5f3bbadc484c728f336c95a8d8fd78 (127.12.158.66:42349)
I20260502 14:06:32.721495 14068 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.12.158.66:35019
I20260502 14:06:32.726015 12921 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskW6FTeC/build/release/bin/kudu
/tmp/dist-test-taskW6FTeC/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.12.158.65:34909
--local_ip_for_outbound_sockets=127.12.158.65
--tserver_master_addrs=127.12.158.126:36477
--webserver_port=41631
--webserver_interface=127.12.158.65
--builtin_ntp_servers=127.12.158.84:38351
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20260502 14:06:32.728324 15269 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51864: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
I20260502 14:06:32.737689 15785 log.cc:826] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Log is configured to *not* fsync() on all Append() calls
W20260502 14:06:32.756803 15269 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51864: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:32.758351 15269 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51864: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:32.785601 15269 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51864: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:32.806661 15896 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260502 14:06:32.806802 15896 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260502 14:06:32.806820 15896 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260502 14:06:32.808336 15896 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260502 14:06:32.808384 15896 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.12.158.65
I20260502 14:06:32.809900 15896 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.12.158.84:38351
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.12.158.65:34909
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.12.158.65
--webserver_port=41631
--tserver_master_addrs=127.12.158.126:36477
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.15896
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.12.158.65
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type RELEASE
built by None at 02 May 2026 13:43:15 UTC on e7f111948823
build id 11709
I20260502 14:06:32.810103 15896 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260502 14:06:32.810330 15896 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260502 14:06:32.810943 15896 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260502 14:06:32.812774 15906 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:32.812824 15904 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:32.812779 15903 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260502 14:06:32.813540 15896 server_base.cc:1061] running on GCE node
I20260502 14:06:32.813691 15896 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260502 14:06:32.813889 15896 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260502 14:06:32.815076 15896 hybrid_clock.cc:648] HybridClock initialized: now 1777730792815059 us; error 31 us; skew 500 ppm
I20260502 14:06:32.816190 15896 webserver.cc:492] Webserver started at http://127.12.158.65:41631/ using document root <none> and password file <none>
I20260502 14:06:32.816376 15896 fs_manager.cc:362] Metadata directory not provided
I20260502 14:06:32.816437 15896 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260502 14:06:32.817530 15896 fs_manager.cc:714] Time spent opening directory manager: real 0.001s user 0.001s sys 0.000s
I20260502 14:06:32.818182 15912 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:32.818380 15896 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:32.818449 15896 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/data,/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/wal
uuid: "c9dd32be290b436fb7f776b6f481451b"
format_stamp: "Formatted at 2026-05-02 14:06:21 on dist-test-slave-23m0"
I20260502 14:06:32.818693 15896 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
W20260502 14:06:32.819208 15269 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51864: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
I20260502 14:06:32.836380 15896 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260502 14:06:32.836616 15896 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260502 14:06:32.836736 15896 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260502 14:06:32.836915 15896 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260502 14:06:32.837353 15919 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20260502 14:06:32.838269 15896 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20260502 14:06:32.838321 15896 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s user 0.000s sys 0.000s
I20260502 14:06:32.838364 15896 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20260502 14:06:32.838944 15896 ts_tablet_manager.cc:616] Registered 1 tablets
I20260502 14:06:32.838995 15896 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s user 0.000s sys 0.000s
I20260502 14:06:32.839048 15919 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Bootstrap starting.
I20260502 14:06:32.846472 15896 rpc_server.cc:307] RPC server started. Bound to: 127.12.158.65:34909
I20260502 14:06:32.846580 16026 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.12.158.65:34909 every 8 connection(s)
I20260502 14:06:32.846808 15896 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/data/info.pb
I20260502 14:06:32.850770 12921 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskW6FTeC/build/release/bin/kudu as pid 15896
I20260502 14:06:32.850853 12921 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskW6FTeC/build/release/bin/kudu with pid 15224
I20260502 14:06:32.851418 16027 heartbeater.cc:344] Connected to a master server at 127.12.158.126:36477
I20260502 14:06:32.851538 16027 heartbeater.cc:461] Registering TS with master...
I20260502 14:06:32.851900 16027 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
I20260502 14:06:32.852439 14068 ts_manager.cc:194] Re-registered known tserver with Master: c9dd32be290b436fb7f776b6f481451b (127.12.158.65:34909)
I20260502 14:06:32.852818 14068 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.12.158.65:59037
I20260502 14:06:32.860128 12921 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskW6FTeC/build/release/bin/kudu
/tmp/dist-test-taskW6FTeC/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.12.158.67:44861
--local_ip_for_outbound_sockets=127.12.158.67
--tserver_master_addrs=127.12.158.126:36477
--webserver_port=37231
--webserver_interface=127.12.158.67
--builtin_ntp_servers=127.12.158.84:38351
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20260502 14:06:32.893544 15919 log.cc:826] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Log is configured to *not* fsync() on all Append() calls
W20260502 14:06:32.947263 16030 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260502 14:06:32.947432 16030 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260502 14:06:32.947465 16030 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260502 14:06:32.949045 16030 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260502 14:06:32.949121 16030 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.12.158.67
I20260502 14:06:32.950577 16030 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.12.158.84:38351
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.12.158.67:44861
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.12.158.67
--webserver_port=37231
--tserver_master_addrs=127.12.158.126:36477
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.16030
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.12.158.67
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type RELEASE
built by None at 02 May 2026 13:43:15 UTC on e7f111948823
build id 11709
I20260502 14:06:32.950788 16030 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260502 14:06:32.950978 16030 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260502 14:06:32.951628 16030 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260502 14:06:32.953363 16037 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:32.953420 16038 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:32.953641 16040 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260502 14:06:32.953830 16030 server_base.cc:1061] running on GCE node
I20260502 14:06:32.953979 16030 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260502 14:06:32.954164 16030 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260502 14:06:32.957381 16030 hybrid_clock.cc:648] HybridClock initialized: now 1777730792957365 us; error 30 us; skew 500 ppm
I20260502 14:06:32.958420 16030 webserver.cc:492] Webserver started at http://127.12.158.67:37231/ using document root <none> and password file <none>
I20260502 14:06:32.958609 16030 fs_manager.cc:362] Metadata directory not provided
I20260502 14:06:32.958673 16030 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260502 14:06:32.960001 16030 fs_manager.cc:714] Time spent opening directory manager: real 0.001s user 0.001s sys 0.000s
I20260502 14:06:32.960683 16046 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:32.960879 16030 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:32.960951 16030 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/data,/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/wal
uuid: "fcb69d1fc3094c95bd74e18f784e388d"
format_stamp: "Formatted at 2026-05-02 14:06:22 on dist-test-slave-23m0"
I20260502 14:06:32.961199 16030 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260502 14:06:32.984153 16030 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260502 14:06:32.984413 16030 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260502 14:06:32.984539 16030 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260502 14:06:32.984747 16030 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260502 14:06:32.985186 16053 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20260502 14:06:32.985911 16030 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20260502 14:06:32.985960 16030 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s user 0.000s sys 0.000s
I20260502 14:06:32.986006 16030 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20260502 14:06:32.986485 16030 ts_tablet_manager.cc:616] Registered 1 tablets
I20260502 14:06:32.986524 16030 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s user 0.000s sys 0.000s
I20260502 14:06:32.986629 16053 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Bootstrap starting.
I20260502 14:06:32.993144 16030 rpc_server.cc:307] RPC server started. Bound to: 127.12.158.67:44861
I20260502 14:06:32.993494 16030 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/data/info.pb
I20260502 14:06:32.995874 12921 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskW6FTeC/build/release/bin/kudu as pid 16030
I20260502 14:06:32.994346 16160 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.12.158.67:44861 every 8 connection(s)
I20260502 14:06:33.012789 16161 heartbeater.cc:344] Connected to a master server at 127.12.158.126:36477
I20260502 14:06:33.012893 16161 heartbeater.cc:461] Registering TS with master...
I20260502 14:06:33.013096 16161 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
I20260502 14:06:33.013671 14068 ts_manager.cc:194] Re-registered known tserver with Master: fcb69d1fc3094c95bd74e18f784e388d (127.12.158.67:44861)
I20260502 14:06:33.014243 14068 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.12.158.67:59505
I20260502 14:06:33.033179 16053 log.cc:826] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Log is configured to *not* fsync() on all Append() calls
I20260502 14:06:33.161988 15827 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260502 14:06:33.162191 15961 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260502 14:06:33.165340 15693 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260502 14:06:33.172291 16076 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260502 14:06:33.580279 15759 heartbeater.cc:499] Master 127.12.158.126:36477 was elected leader, sending a full tablet report...
I20260502 14:06:33.675798 15785 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Bootstrap replayed 1/2 log segments. Stats: ops{read=4856 overwritten=0 applied=4855 ignored=0} inserts{seen=242650 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20260502 14:06:33.722290 15893 heartbeater.cc:499] Master 127.12.158.126:36477 was elected leader, sending a full tablet report...
I20260502 14:06:33.853525 16027 heartbeater.cc:499] Master 127.12.158.126:36477 was elected leader, sending a full tablet report...
I20260502 14:06:33.898166 15785 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Bootstrap replayed 2/2 log segments. Stats: ops{read=6108 overwritten=0 applied=6107 ignored=0} inserts{seen=305250 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20260502 14:06:33.898670 15785 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Bootstrap complete.
I20260502 14:06:33.901541 15785 ts_tablet_manager.cc:1403] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Time spent bootstrapping tablet: real 1.196s user 0.998s sys 0.168s
I20260502 14:06:33.902406 15785 raft_consensus.cc:359] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 4 FOLLOWER]: Replica starting. Triggering 1 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:33.903074 15785 raft_consensus.cc:740] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 4 FOLLOWER]: Becoming Follower/Learner. State: Replica: 8b5f3bbadc484c728f336c95a8d8fd78, State: Initialized, Role: FOLLOWER
I20260502 14:06:33.903216 15785 consensus_queue.cc:260] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 6107, Last appended: 3.6108, Last appended by leader: 6108, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:33.903461 15785 ts_tablet_manager.cc:1434] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Time spent starting tablet: real 0.002s user 0.003s sys 0.001s
W20260502 14:06:33.985235 15807 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46064: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
I20260502 14:06:34.015100 16161 heartbeater.cc:499] Master 127.12.158.126:36477 was elected leader, sending a full tablet report...
I20260502 14:06:34.069229 15919 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Bootstrap replayed 1/2 log segments. Stats: ops{read=4623 overwritten=0 applied=4621 ignored=0} inserts{seen=230950 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
W20260502 14:06:34.144033 15807 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46064: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
I20260502 14:06:34.151010 16199 raft_consensus.cc:493] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 4 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260502 14:06:34.151127 16199 raft_consensus.cc:515] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 4 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:34.151455 16199 leader_election.cc:290] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [CANDIDATE]: Term 5 pre-election: Requested pre-vote from peers c9dd32be290b436fb7f776b6f481451b (127.12.158.65:34909), fcb69d1fc3094c95bd74e18f784e388d (127.12.158.67:44861)
I20260502 14:06:34.155521 16108 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" candidate_term: 5 candidate_status { last_received { term: 3 index: 6108 } } ignore_live_leader: false dest_uuid: "fcb69d1fc3094c95bd74e18f784e388d" is_pre_election: true
I20260502 14:06:34.155552 15966 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" candidate_term: 5 candidate_status { last_received { term: 3 index: 6108 } } ignore_live_leader: false dest_uuid: "c9dd32be290b436fb7f776b6f481451b" is_pre_election: true
W20260502 14:06:34.156582 15779 leader_election.cc:343] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [CANDIDATE]: Term 5 pre-election: Tablet error from VoteRequest() call to peer c9dd32be290b436fb7f776b6f481451b (127.12.158.65:34909): Illegal state: must be running to vote when last-logged opid is not known
W20260502 14:06:34.156678 15780 leader_election.cc:343] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [CANDIDATE]: Term 5 pre-election: Tablet error from VoteRequest() call to peer fcb69d1fc3094c95bd74e18f784e388d (127.12.158.67:44861): Illegal state: must be running to vote when last-logged opid is not known
I20260502 14:06:34.156733 15780 leader_election.cc:304] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [CANDIDATE]: Term 5 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 8b5f3bbadc484c728f336c95a8d8fd78; no voters: c9dd32be290b436fb7f776b6f481451b, fcb69d1fc3094c95bd74e18f784e388d
I20260502 14:06:34.156842 16199 raft_consensus.cc:2749] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 4 FOLLOWER]: Leader pre-election lost for term 5. Reason: could not achieve majority
I20260502 14:06:34.196535 16053 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Bootstrap replayed 1/2 log segments. Stats: ops{read=4854 overwritten=0 applied=4853 ignored=0} inserts{seen=242550 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
W20260502 14:06:34.214268 15807 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46064: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
I20260502 14:06:34.323791 15919 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Bootstrap replayed 2/2 log segments. Stats: ops{read=6108 overwritten=0 applied=6107 ignored=0} inserts{seen=305250 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20260502 14:06:34.324229 15919 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Bootstrap complete.
I20260502 14:06:34.327041 15919 ts_tablet_manager.cc:1403] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Time spent bootstrapping tablet: real 1.488s user 1.242s sys 0.207s
I20260502 14:06:34.327945 15919 raft_consensus.cc:359] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 4 FOLLOWER]: Replica starting. Triggering 1 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:34.328514 15919 raft_consensus.cc:740] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 4 FOLLOWER]: Becoming Follower/Learner. State: Replica: c9dd32be290b436fb7f776b6f481451b, State: Initialized, Role: FOLLOWER
I20260502 14:06:34.328652 15919 consensus_queue.cc:260] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 6107, Last appended: 3.6108, Last appended by leader: 6108, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:34.328838 15919 ts_tablet_manager.cc:1434] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Time spent starting tablet: real 0.002s user 0.005s sys 0.000s
W20260502 14:06:34.370555 15941 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36136: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:34.371978 15807 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46064: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
I20260502 14:06:34.402575 16053 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Bootstrap replayed 2/2 log segments. Stats: ops{read=6107 overwritten=0 applied=6105 ignored=0} inserts{seen=305150 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20260502 14:06:34.403061 16053 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Bootstrap complete.
I20260502 14:06:34.405696 16053 ts_tablet_manager.cc:1403] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Time spent bootstrapping tablet: real 1.419s user 1.244s sys 0.164s
I20260502 14:06:34.406193 16053 raft_consensus.cc:359] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 4 FOLLOWER]: Replica starting. Triggering 2 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:34.406781 16053 raft_consensus.cc:740] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 4 FOLLOWER]: Becoming Follower/Learner. State: Replica: fcb69d1fc3094c95bd74e18f784e388d, State: Initialized, Role: FOLLOWER
I20260502 14:06:34.406944 16053 consensus_queue.cc:260] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 6105, Last appended: 3.6107, Last appended by leader: 6107, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:34.407186 16053 ts_tablet_manager.cc:1434] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Time spent starting tablet: real 0.001s user 0.003s sys 0.000s
W20260502 14:06:34.418561 14695 scanner-internal.cc:458] Time spent opening tablet: real 2.405s user 0.000s sys 0.001s
W20260502 14:06:34.447532 15806 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46064: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:34.451086 15941 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36136: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
I20260502 14:06:34.490978 16199 raft_consensus.cc:493] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 4 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260502 14:06:34.491089 16199 raft_consensus.cc:515] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 4 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:34.491250 16199 leader_election.cc:290] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [CANDIDATE]: Term 5 pre-election: Requested pre-vote from peers c9dd32be290b436fb7f776b6f481451b (127.12.158.65:34909), fcb69d1fc3094c95bd74e18f784e388d (127.12.158.67:44861)
I20260502 14:06:34.491510 15966 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" candidate_term: 5 candidate_status { last_received { term: 3 index: 6108 } } ignore_live_leader: false dest_uuid: "c9dd32be290b436fb7f776b6f481451b" is_pre_election: true
I20260502 14:06:34.491659 15966 raft_consensus.cc:2468] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 4 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 8b5f3bbadc484c728f336c95a8d8fd78 in term 4.
I20260502 14:06:34.491885 15779 leader_election.cc:304] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [CANDIDATE]: Term 5 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 8b5f3bbadc484c728f336c95a8d8fd78, c9dd32be290b436fb7f776b6f481451b; no voters:
I20260502 14:06:34.491910 16108 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" candidate_term: 5 candidate_status { last_received { term: 3 index: 6108 } } ignore_live_leader: false dest_uuid: "fcb69d1fc3094c95bd74e18f784e388d" is_pre_election: true
I20260502 14:06:34.492048 16108 raft_consensus.cc:2468] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 4 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 8b5f3bbadc484c728f336c95a8d8fd78 in term 4.
I20260502 14:06:34.492336 16199 raft_consensus.cc:2804] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 4 FOLLOWER]: Leader pre-election won for term 5
I20260502 14:06:34.492386 16199 raft_consensus.cc:493] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 4 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260502 14:06:34.492412 16199 raft_consensus.cc:3060] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 4 FOLLOWER]: Advancing to term 5
I20260502 14:06:34.493769 16199 raft_consensus.cc:515] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 5 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:34.493891 16199 leader_election.cc:290] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [CANDIDATE]: Term 5 election: Requested vote from peers c9dd32be290b436fb7f776b6f481451b (127.12.158.65:34909), fcb69d1fc3094c95bd74e18f784e388d (127.12.158.67:44861)
I20260502 14:06:34.494138 16108 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" candidate_term: 5 candidate_status { last_received { term: 3 index: 6108 } } ignore_live_leader: false dest_uuid: "fcb69d1fc3094c95bd74e18f784e388d"
I20260502 14:06:34.494202 16108 raft_consensus.cc:3060] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 4 FOLLOWER]: Advancing to term 5
I20260502 14:06:34.495306 16108 raft_consensus.cc:2468] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 5 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 8b5f3bbadc484c728f336c95a8d8fd78 in term 5.
I20260502 14:06:34.495445 15780 leader_election.cc:304] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [CANDIDATE]: Term 5 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 8b5f3bbadc484c728f336c95a8d8fd78, fcb69d1fc3094c95bd74e18f784e388d; no voters:
I20260502 14:06:34.495491 15966 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" candidate_term: 5 candidate_status { last_received { term: 3 index: 6108 } } ignore_live_leader: false dest_uuid: "c9dd32be290b436fb7f776b6f481451b"
I20260502 14:06:34.495558 15966 raft_consensus.cc:3060] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 4 FOLLOWER]: Advancing to term 5
I20260502 14:06:34.496557 15966 raft_consensus.cc:2468] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 5 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 8b5f3bbadc484c728f336c95a8d8fd78 in term 5.
I20260502 14:06:34.496722 16199 raft_consensus.cc:2804] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 5 FOLLOWER]: Leader election won for term 5
I20260502 14:06:34.496866 16199 raft_consensus.cc:697] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 5 LEADER]: Becoming Leader. State: Replica: 8b5f3bbadc484c728f336c95a8d8fd78, State: Running, Role: LEADER
I20260502 14:06:34.496944 16199 consensus_queue.cc:237] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 6107, Committed index: 6107, Last appended: 3.6108, Last appended by leader: 6108, Current term: 5, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:34.497548 14068 catalog_manager.cc:5671] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 reported cstate change: term changed from 3 to 5, leader changed from c9dd32be290b436fb7f776b6f481451b (127.12.158.65) to 8b5f3bbadc484c728f336c95a8d8fd78 (127.12.158.66). New cstate: current_term: 5 leader_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } health_report { overall_health: UNKNOWN } } }
W20260502 14:06:34.525306 16075 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51904: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:34.532855 16075 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51904: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
I20260502 14:06:34.600447 16108 raft_consensus.cc:1275] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 5 FOLLOWER]: Refusing update from remote peer 8b5f3bbadc484c728f336c95a8d8fd78: Log matching property violated. Preceding OpId in replica: term: 3 index: 6107. Preceding OpId from leader: term: 5 index: 6109. (index mismatch)
I20260502 14:06:34.600744 16199 consensus_queue.cc:1048] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [LEADER]: Connected to new peer: Peer: permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 6109, Last known committed idx: 6105, Time since last communication: 0.000s
I20260502 14:06:34.603448 15966 raft_consensus.cc:1275] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 5 FOLLOWER]: Refusing update from remote peer 8b5f3bbadc484c728f336c95a8d8fd78: Log matching property violated. Preceding OpId in replica: term: 3 index: 6108. Preceding OpId from leader: term: 5 index: 6109. (index mismatch)
I20260502 14:06:34.603826 16211 consensus_queue.cc:1048] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [LEADER]: Connected to new peer: Peer: permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 6109, Last known committed idx: 6107, Time since last communication: 0.000s
W20260502 14:06:34.605046 16075 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51904: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:35.267117 14696 scanner-internal.cc:458] Time spent opening tablet: real 3.805s user 0.001s sys 0.001s
W20260502 14:06:35.434470 14697 scanner-internal.cc:458] Time spent opening tablet: real 4.006s user 0.001s sys 0.001s
I20260502 14:06:38.427963 15827 tablet_service.cc:1467] Tablet server has 1 leaders and 3 scanners
I20260502 14:06:38.428932 16076 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260502 14:06:38.431456 15961 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260502 14:06:38.437016 15693 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260502 14:06:38.801257 14068 ts_manager.cc:284] Unset tserver state for bd0a653794c34d9591e2d5c89c802493 from MAINTENANCE_MODE
I20260502 14:06:38.876739 14068 ts_manager.cc:284] Unset tserver state for 8b5f3bbadc484c728f336c95a8d8fd78 from MAINTENANCE_MODE
I20260502 14:06:38.886615 14068 ts_manager.cc:284] Unset tserver state for fcb69d1fc3094c95bd74e18f784e388d from MAINTENANCE_MODE
I20260502 14:06:38.889183 14068 ts_manager.cc:284] Unset tserver state for c9dd32be290b436fb7f776b6f481451b from MAINTENANCE_MODE
I20260502 14:06:39.210084 14068 ts_manager.cc:295] Set tserver state for 8b5f3bbadc484c728f336c95a8d8fd78 to MAINTENANCE_MODE
I20260502 14:06:39.218279 14068 ts_manager.cc:295] Set tserver state for bd0a653794c34d9591e2d5c89c802493 to MAINTENANCE_MODE
I20260502 14:06:39.370604 14068 ts_manager.cc:295] Set tserver state for fcb69d1fc3094c95bd74e18f784e388d to MAINTENANCE_MODE
I20260502 14:06:39.586383 15759 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
I20260502 14:06:39.598637 15827 tablet_service.cc:1460] Tablet server 8b5f3bbadc484c728f336c95a8d8fd78 set to quiescing
I20260502 14:06:39.598722 15827 tablet_service.cc:1467] Tablet server has 1 leaders and 3 scanners
I20260502 14:06:39.601536 16211 raft_consensus.cc:993] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: : Instructing follower c9dd32be290b436fb7f776b6f481451b to start an election
I20260502 14:06:39.601606 16211 raft_consensus.cc:1081] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 5 LEADER]: Signalling peer c9dd32be290b436fb7f776b6f481451b to start an election
I20260502 14:06:39.601964 15981 tablet_service.cc:2044] Received Run Leader Election RPC: tablet_id: "4710965d239c4545af4272cddced8dfb"
dest_uuid: "c9dd32be290b436fb7f776b6f481451b"
from {username='slave'} at 127.12.158.66:60971
I20260502 14:06:39.602069 15981 raft_consensus.cc:493] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 5 FOLLOWER]: Starting forced leader election (received explicit request)
I20260502 14:06:39.602099 15981 raft_consensus.cc:3060] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 5 FOLLOWER]: Advancing to term 6
I20260502 14:06:39.602914 15981 raft_consensus.cc:515] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 6 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:39.603161 15981 leader_election.cc:290] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [CANDIDATE]: Term 6 election: Requested vote from peers 8b5f3bbadc484c728f336c95a8d8fd78 (127.12.158.66:42349), fcb69d1fc3094c95bd74e18f784e388d (127.12.158.67:44861)
I20260502 14:06:39.605130 15966 raft_consensus.cc:1240] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 6 FOLLOWER]: Rejecting Update request from peer 8b5f3bbadc484c728f336c95a8d8fd78 for earlier term 5. Current term is 6. Ops: [5.10658-5.10658]
I20260502 14:06:39.605597 16231 consensus_queue.cc:1059] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [LEADER]: Peer responded invalid term: Peer: permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 }, Status: INVALID_TERM, Last received: 5.10657, Next index: 10658, Last known committed idx: 10657, Time since last communication: 0.000s
I20260502 14:06:39.605705 16231 raft_consensus.cc:3055] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 5 LEADER]: Stepping down as leader of term 5
I20260502 14:06:39.605732 16231 raft_consensus.cc:740] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 5 LEADER]: Becoming Follower/Learner. State: Replica: 8b5f3bbadc484c728f336c95a8d8fd78, State: Running, Role: LEADER
I20260502 14:06:39.605784 16231 consensus_queue.cc:260] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 10657, Committed index: 10657, Last appended: 5.10660, Last appended by leader: 10660, Current term: 5, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:39.605880 16231 raft_consensus.cc:3060] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 5 FOLLOWER]: Advancing to term 6
I20260502 14:06:39.606635 14068 ts_manager.cc:295] Set tserver state for c9dd32be290b436fb7f776b6f481451b to MAINTENANCE_MODE
I20260502 14:06:39.608474 16161 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
I20260502 14:06:39.608945 16027 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
I20260502 14:06:39.612339 15693 tablet_service.cc:1460] Tablet server bd0a653794c34d9591e2d5c89c802493 set to quiescing
I20260502 14:06:39.612396 15693 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260502 14:06:39.627985 15893 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
I20260502 14:06:39.629733 15847 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "c9dd32be290b436fb7f776b6f481451b" candidate_term: 6 candidate_status { last_received { term: 5 index: 10657 } } ignore_live_leader: true dest_uuid: "8b5f3bbadc484c728f336c95a8d8fd78"
I20260502 14:06:39.629859 15847 raft_consensus.cc:2410] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 6 FOLLOWER]: Leader election vote request: Denying vote to candidate c9dd32be290b436fb7f776b6f481451b for term 6 because replica has last-logged OpId of term: 5 index: 10660, which is greater than that of the candidate, which has last-logged OpId of term: 5 index: 10657.
I20260502 14:06:39.631098 16109 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "c9dd32be290b436fb7f776b6f481451b" candidate_term: 6 candidate_status { last_received { term: 5 index: 10657 } } ignore_live_leader: true dest_uuid: "fcb69d1fc3094c95bd74e18f784e388d"
I20260502 14:06:39.631194 16109 raft_consensus.cc:3060] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 5 FOLLOWER]: Advancing to term 6
I20260502 14:06:39.632177 16109 raft_consensus.cc:2410] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 6 FOLLOWER]: Leader election vote request: Denying vote to candidate c9dd32be290b436fb7f776b6f481451b for term 6 because replica has last-logged OpId of term: 5 index: 10658, which is greater than that of the candidate, which has last-logged OpId of term: 5 index: 10657.
I20260502 14:06:39.632738 15914 leader_election.cc:304] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [CANDIDATE]: Term 6 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: c9dd32be290b436fb7f776b6f481451b; no voters: 8b5f3bbadc484c728f336c95a8d8fd78, fcb69d1fc3094c95bd74e18f784e388d
I20260502 14:06:39.633040 16395 raft_consensus.cc:2749] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 6 FOLLOWER]: Leader election lost for term 6. Reason: could not achieve majority
I20260502 14:06:39.666026 16076 tablet_service.cc:1460] Tablet server fcb69d1fc3094c95bd74e18f784e388d set to quiescing
I20260502 14:06:39.666090 16076 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260502 14:06:39.793569 15961 tablet_service.cc:1460] Tablet server c9dd32be290b436fb7f776b6f481451b set to quiescing
I20260502 14:06:39.793649 15961 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20260502 14:06:39.905165 16416 raft_consensus.cc:670] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: failed to trigger leader election: Illegal state: leader elections are disabled
W20260502 14:06:39.956656 16234 raft_consensus.cc:670] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: failed to trigger leader election: Illegal state: leader elections are disabled
W20260502 14:06:40.041317 16395 raft_consensus.cc:670] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: failed to trigger leader election: Illegal state: leader elections are disabled
I20260502 14:06:40.740815 15827 tablet_service.cc:1460] Tablet server 8b5f3bbadc484c728f336c95a8d8fd78 set to quiescing
I20260502 14:06:40.740886 15827 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260502 14:06:40.796701 12921 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskW6FTeC/build/release/bin/kudu with pid 15631
I20260502 14:06:40.801659 12921 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskW6FTeC/build/release/bin/kudu
/tmp/dist-test-taskW6FTeC/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.12.158.68:37147
--local_ip_for_outbound_sockets=127.12.158.68
--tserver_master_addrs=127.12.158.126:36477
--webserver_port=44081
--webserver_interface=127.12.158.68
--builtin_ntp_servers=127.12.158.84:38351
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20260502 14:06:40.876291 16428 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260502 14:06:40.876463 16428 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260502 14:06:40.876495 16428 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260502 14:06:40.877938 16428 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260502 14:06:40.878016 16428 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.12.158.68
I20260502 14:06:40.879484 16428 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.12.158.84:38351
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.12.158.68:37147
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.12.158.68
--webserver_port=44081
--tserver_master_addrs=127.12.158.126:36477
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.16428
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.12.158.68
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type RELEASE
built by None at 02 May 2026 13:43:15 UTC on e7f111948823
build id 11709
I20260502 14:06:40.879732 16428 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260502 14:06:40.879968 16428 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260502 14:06:40.880615 16428 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260502 14:06:40.882504 16436 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260502 14:06:40.882558 16428 server_base.cc:1061] running on GCE node
W20260502 14:06:40.882653 16433 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:40.882577 16434 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260502 14:06:40.882946 16428 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260502 14:06:40.883152 16428 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260502 14:06:40.884306 16428 hybrid_clock.cc:648] HybridClock initialized: now 1777730800884292 us; error 28 us; skew 500 ppm
I20260502 14:06:40.885294 16428 webserver.cc:492] Webserver started at http://127.12.158.68:44081/ using document root <none> and password file <none>
I20260502 14:06:40.885483 16428 fs_manager.cc:362] Metadata directory not provided
I20260502 14:06:40.885541 16428 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260502 14:06:40.886641 16428 fs_manager.cc:714] Time spent opening directory manager: real 0.001s user 0.001s sys 0.000s
I20260502 14:06:40.887177 16442 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:40.887337 16428 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:40.887413 16428 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/data,/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/wal
uuid: "bd0a653794c34d9591e2d5c89c802493"
format_stamp: "Formatted at 2026-05-02 14:06:22 on dist-test-slave-23m0"
I20260502 14:06:40.887657 16428 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260502 14:06:40.904604 16428 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260502 14:06:40.904847 16428 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260502 14:06:40.904973 16428 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260502 14:06:40.905170 16428 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260502 14:06:40.905483 16428 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260502 14:06:40.905534 16428 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:40.905575 16428 ts_tablet_manager.cc:616] Registered 0 tablets
I20260502 14:06:40.905613 16428 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:40.911264 16428 rpc_server.cc:307] RPC server started. Bound to: 127.12.158.68:37147
I20260502 14:06:40.911310 16555 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.12.158.68:37147 every 8 connection(s)
I20260502 14:06:40.911584 16428 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/data/info.pb
I20260502 14:06:40.915499 16556 heartbeater.cc:344] Connected to a master server at 127.12.158.126:36477
I20260502 14:06:40.915593 16556 heartbeater.cc:461] Registering TS with master...
I20260502 14:06:40.915773 12921 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskW6FTeC/build/release/bin/kudu as pid 16428
I20260502 14:06:40.915812 16556 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
I20260502 14:06:40.915895 12921 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskW6FTeC/build/release/bin/kudu with pid 15763
I20260502 14:06:40.920542 14068 ts_manager.cc:194] Re-registered known tserver with Master: bd0a653794c34d9591e2d5c89c802493 (127.12.158.68:37147)
I20260502 14:06:40.921048 14068 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.12.158.68:45073
W20260502 14:06:40.926807 14667 meta_cache.cc:302] tablet 4710965d239c4545af4272cddced8dfb: replica 8b5f3bbadc484c728f336c95a8d8fd78 (127.12.158.66:42349) has failed: Network error: recv got EOF from 127.12.158.66:42349 (error 108)
I20260502 14:06:40.927248 12921 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskW6FTeC/build/release/bin/kudu
/tmp/dist-test-taskW6FTeC/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.12.158.66:42349
--local_ip_for_outbound_sockets=127.12.158.66
--tserver_master_addrs=127.12.158.126:36477
--webserver_port=33233
--webserver_interface=127.12.158.66
--builtin_ntp_servers=127.12.158.84:38351
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20260502 14:06:40.929415 15941 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36136: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:40.931277 15941 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36136: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:40.931370 15940 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36136: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:40.935096 16074 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51904: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:40.935108 16075 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51904: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:40.947481 15940 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36136: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:40.949601 15940 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36136: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:40.955214 16074 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51904: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:40.957311 16074 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51904: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:40.972712 15940 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36136: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:40.975841 15940 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36136: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:40.983404 16074 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51904: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:40.985566 16074 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51904: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:41.004969 16560 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260502 14:06:41.005106 16560 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260502 14:06:41.005127 16560 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260502 14:06:41.006564 16560 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260502 14:06:41.006613 16560 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.12.158.66
I20260502 14:06:41.008164 16560 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.12.158.84:38351
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.12.158.66:42349
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.12.158.66
--webserver_port=33233
--tserver_master_addrs=127.12.158.126:36477
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.16560
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.12.158.66
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type RELEASE
built by None at 02 May 2026 13:43:15 UTC on e7f111948823
build id 11709
I20260502 14:06:41.008358 16560 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260502 14:06:41.008570 16560 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260502 14:06:41.009164 16560 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260502 14:06:41.009902 15940 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36136: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:41.010914 16566 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:41.010936 16567 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:41.010936 16569 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260502 14:06:41.011477 16560 server_base.cc:1061] running on GCE node
I20260502 14:06:41.011653 16560 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260502 14:06:41.011900 16560 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
W20260502 14:06:41.012869 15940 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36136: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
I20260502 14:06:41.013077 16560 hybrid_clock.cc:648] HybridClock initialized: now 1777730801013060 us; error 21 us; skew 500 ppm
I20260502 14:06:41.014145 16560 webserver.cc:492] Webserver started at http://127.12.158.66:33233/ using document root <none> and password file <none>
I20260502 14:06:41.014335 16560 fs_manager.cc:362] Metadata directory not provided
I20260502 14:06:41.014386 16560 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260502 14:06:41.015561 16560 fs_manager.cc:714] Time spent opening directory manager: real 0.001s user 0.000s sys 0.001s
I20260502 14:06:41.016238 16575 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:41.016438 16560 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:41.016526 16560 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/data,/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/wal
uuid: "8b5f3bbadc484c728f336c95a8d8fd78"
format_stamp: "Formatted at 2026-05-02 14:06:22 on dist-test-slave-23m0"
I20260502 14:06:41.016773 16560 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
W20260502 14:06:41.025432 16074 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51904: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:41.027493 16074 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51904: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
I20260502 14:06:41.032840 16560 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260502 14:06:41.033124 16560 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260502 14:06:41.033267 16560 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260502 14:06:41.033489 16560 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260502 14:06:41.033880 16582 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20260502 14:06:41.034523 16560 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20260502 14:06:41.034569 16560 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s user 0.000s sys 0.000s
I20260502 14:06:41.034591 16560 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20260502 14:06:41.035126 16560 ts_tablet_manager.cc:616] Registered 1 tablets
I20260502 14:06:41.035181 16560 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s user 0.000s sys 0.000s
I20260502 14:06:41.035205 16582 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Bootstrap starting.
I20260502 14:06:41.042065 16560 rpc_server.cc:307] RPC server started. Bound to: 127.12.158.66:42349
I20260502 14:06:41.042110 16689 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.12.158.66:42349 every 8 connection(s)
I20260502 14:06:41.042395 16560 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/data/info.pb
I20260502 14:06:41.047194 16690 heartbeater.cc:344] Connected to a master server at 127.12.158.126:36477
I20260502 14:06:41.047268 16690 heartbeater.cc:461] Registering TS with master...
I20260502 14:06:41.047566 16690 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
I20260502 14:06:41.048105 14069 ts_manager.cc:194] Re-registered known tserver with Master: 8b5f3bbadc484c728f336c95a8d8fd78 (127.12.158.66:42349)
I20260502 14:06:41.048652 14069 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.12.158.66:52257
I20260502 14:06:41.051914 12921 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskW6FTeC/build/release/bin/kudu as pid 16560
I20260502 14:06:41.052053 12921 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskW6FTeC/build/release/bin/kudu with pid 15896
W20260502 14:06:41.061465 14666 connection.cc:570] client connection to 127.12.158.65:34909 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
I20260502 14:06:41.061928 12921 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskW6FTeC/build/release/bin/kudu
/tmp/dist-test-taskW6FTeC/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.12.158.65:34909
--local_ip_for_outbound_sockets=127.12.158.65
--tserver_master_addrs=127.12.158.126:36477
--webserver_port=41631
--webserver_interface=127.12.158.65
--builtin_ntp_servers=127.12.158.84:38351
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20260502 14:06:41.071969 16582 log.cc:826] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Log is configured to *not* fsync() on all Append() calls
W20260502 14:06:41.076215 16074 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51904: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:41.077260 16074 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51904: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
I20260502 14:06:41.099604 14695 meta_cache.cc:1510] marking tablet server c9dd32be290b436fb7f776b6f481451b (127.12.158.65:34909) as failed
W20260502 14:06:41.132833 16074 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51904: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:41.135090 16074 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51904: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:41.141808 16693 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260502 14:06:41.141949 16693 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260502 14:06:41.141969 16693 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260502 14:06:41.143424 16693 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260502 14:06:41.143472 16693 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.12.158.65
I20260502 14:06:41.145025 16693 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.12.158.84:38351
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.12.158.65:34909
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.12.158.65
--webserver_port=41631
--tserver_master_addrs=127.12.158.126:36477
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.16693
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.12.158.65
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type RELEASE
built by None at 02 May 2026 13:43:15 UTC on e7f111948823
build id 11709
I20260502 14:06:41.145346 16693 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260502 14:06:41.145684 16693 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260502 14:06:41.146409 16693 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260502 14:06:41.148164 16701 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:41.148219 16703 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:41.148180 16700 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260502 14:06:41.148793 16693 server_base.cc:1061] running on GCE node
I20260502 14:06:41.148967 16693 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260502 14:06:41.149181 16693 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260502 14:06:41.150331 16693 hybrid_clock.cc:648] HybridClock initialized: now 1777730801150317 us; error 31 us; skew 500 ppm
I20260502 14:06:41.151377 16693 webserver.cc:492] Webserver started at http://127.12.158.65:41631/ using document root <none> and password file <none>
I20260502 14:06:41.151564 16693 fs_manager.cc:362] Metadata directory not provided
I20260502 14:06:41.151624 16693 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260502 14:06:41.152823 16693 fs_manager.cc:714] Time spent opening directory manager: real 0.001s user 0.000s sys 0.001s
I20260502 14:06:41.153489 16709 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:41.153697 16693 fs_manager.cc:730] Time spent opening block manager: real 0.001s user 0.000s sys 0.001s
I20260502 14:06:41.153846 16693 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/data,/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/wal
uuid: "c9dd32be290b436fb7f776b6f481451b"
format_stamp: "Formatted at 2026-05-02 14:06:21 on dist-test-slave-23m0"
I20260502 14:06:41.154134 16693 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
W20260502 14:06:41.175985 16074 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51904: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
I20260502 14:06:41.179257 16693 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260502 14:06:41.179517 16693 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260502 14:06:41.179643 16693 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260502 14:06:41.179926 16693 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260502 14:06:41.180388 16716 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20260502 14:06:41.181090 16693 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20260502 14:06:41.181141 16693 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s user 0.000s sys 0.000s
I20260502 14:06:41.181173 16693 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20260502 14:06:41.181742 16693 ts_tablet_manager.cc:616] Registered 1 tablets
I20260502 14:06:41.181782 16693 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s user 0.000s sys 0.000s
I20260502 14:06:41.181813 16716 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Bootstrap starting.
I20260502 14:06:41.189003 16693 rpc_server.cc:307] RPC server started. Bound to: 127.12.158.65:34909
I20260502 14:06:41.189065 16823 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.12.158.65:34909 every 8 connection(s)
I20260502 14:06:41.189429 16693 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/data/info.pb
I20260502 14:06:41.194856 16824 heartbeater.cc:344] Connected to a master server at 127.12.158.126:36477
I20260502 14:06:41.194945 16824 heartbeater.cc:461] Registering TS with master...
I20260502 14:06:41.195123 16824 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
W20260502 14:06:41.195320 16074 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51904: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
I20260502 14:06:41.195739 14069 ts_manager.cc:194] Re-registered known tserver with Master: c9dd32be290b436fb7f776b6f481451b (127.12.158.65:34909)
I20260502 14:06:41.196264 14069 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.12.158.65:40929
I20260502 14:06:41.196508 12921 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskW6FTeC/build/release/bin/kudu as pid 16693
I20260502 14:06:41.196602 12921 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskW6FTeC/build/release/bin/kudu with pid 16030
I20260502 14:06:41.209444 12921 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskW6FTeC/build/release/bin/kudu
/tmp/dist-test-taskW6FTeC/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.12.158.67:44861
--local_ip_for_outbound_sockets=127.12.158.67
--tserver_master_addrs=127.12.158.126:36477
--webserver_port=37231
--webserver_interface=127.12.158.67
--builtin_ntp_servers=127.12.158.84:38351
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20260502 14:06:41.221670 16716 log.cc:826] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Log is configured to *not* fsync() on all Append() calls
W20260502 14:06:41.325809 16827 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260502 14:06:41.326117 16827 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260502 14:06:41.326197 16827 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260502 14:06:41.328648 16827 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260502 14:06:41.328727 16827 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.12.158.67
I20260502 14:06:41.331137 16827 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.12.158.84:38351
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.12.158.67:44861
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.12.158.67
--webserver_port=37231
--tserver_master_addrs=127.12.158.126:36477
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.16827
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.12.158.67
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type RELEASE
built by None at 02 May 2026 13:43:15 UTC on e7f111948823
build id 11709
I20260502 14:06:41.331367 16827 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260502 14:06:41.331662 16827 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260502 14:06:41.332543 16827 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260502 14:06:41.334484 16835 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:41.334599 16834 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:41.334468 16837 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260502 14:06:41.335213 16827 server_base.cc:1061] running on GCE node
I20260502 14:06:41.335392 16827 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260502 14:06:41.335601 16827 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260502 14:06:41.336720 16827 hybrid_clock.cc:648] HybridClock initialized: now 1777730801336676 us; error 56 us; skew 500 ppm
I20260502 14:06:41.338003 16827 webserver.cc:492] Webserver started at http://127.12.158.67:37231/ using document root <none> and password file <none>
I20260502 14:06:41.338284 16827 fs_manager.cc:362] Metadata directory not provided
I20260502 14:06:41.338359 16827 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260502 14:06:41.339910 16827 fs_manager.cc:714] Time spent opening directory manager: real 0.001s user 0.001s sys 0.000s
I20260502 14:06:41.340792 16843 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:41.340953 16827 fs_manager.cc:730] Time spent opening block manager: real 0.001s user 0.001s sys 0.000s
I20260502 14:06:41.341014 16827 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/data,/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/wal
uuid: "fcb69d1fc3094c95bd74e18f784e388d"
format_stamp: "Formatted at 2026-05-02 14:06:22 on dist-test-slave-23m0"
I20260502 14:06:41.341324 16827 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260502 14:06:41.409641 16827 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260502 14:06:41.409955 16827 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260502 14:06:41.410140 16827 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260502 14:06:41.410389 16827 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260502 14:06:41.410969 16850 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20260502 14:06:41.411955 16827 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20260502 14:06:41.412011 16827 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s user 0.000s sys 0.000s
I20260502 14:06:41.412045 16827 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20260502 14:06:41.412787 16827 ts_tablet_manager.cc:616] Registered 1 tablets
I20260502 14:06:41.412860 16827 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s user 0.001s sys 0.000s
I20260502 14:06:41.412941 16850 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Bootstrap starting.
I20260502 14:06:41.419528 16827 rpc_server.cc:307] RPC server started. Bound to: 127.12.158.67:44861
I20260502 14:06:41.419988 16827 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/data/info.pb
I20260502 14:06:41.425863 16957 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.12.158.67:44861 every 8 connection(s)
I20260502 14:06:41.426445 12921 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskW6FTeC/build/release/bin/kudu as pid 16827
I20260502 14:06:41.434777 16958 heartbeater.cc:344] Connected to a master server at 127.12.158.126:36477
I20260502 14:06:41.434862 16958 heartbeater.cc:461] Registering TS with master...
I20260502 14:06:41.435041 16958 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
I20260502 14:06:41.435616 14069 ts_manager.cc:194] Re-registered known tserver with Master: fcb69d1fc3094c95bd74e18f784e388d (127.12.158.67:44861)
I20260502 14:06:41.436132 14069 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.12.158.67:40493
I20260502 14:06:41.465945 16850 log.cc:826] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Log is configured to *not* fsync() on all Append() calls
I20260502 14:06:41.601450 16624 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260502 14:06:41.620250 16490 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260502 14:06:41.621343 16758 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260502 14:06:41.630021 16892 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260502 14:06:41.921821 16556 heartbeater.cc:499] Master 127.12.158.126:36477 was elected leader, sending a full tablet report...
I20260502 14:06:41.995736 16582 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Bootstrap replayed 1/3 log segments. Stats: ops{read=4622 overwritten=0 applied=4621 ignored=0} inserts{seen=230950 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20260502 14:06:42.049348 16690 heartbeater.cc:499] Master 127.12.158.126:36477 was elected leader, sending a full tablet report...
I20260502 14:06:42.204746 16824 heartbeater.cc:499] Master 127.12.158.126:36477 was elected leader, sending a full tablet report...
I20260502 14:06:42.413631 16716 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Bootstrap replayed 1/3 log segments. Stats: ops{read=4623 overwritten=0 applied=4621 ignored=0} inserts{seen=230950 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20260502 14:06:42.437042 16958 heartbeater.cc:499] Master 127.12.158.126:36477 was elected leader, sending a full tablet report...
I20260502 14:06:42.667971 16850 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Bootstrap replayed 1/3 log segments. Stats: ops{read=4622 overwritten=0 applied=4622 ignored=0} inserts{seen=231000 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20260502 14:06:42.812588 16582 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Bootstrap replayed 2/3 log segments. Stats: ops{read=9244 overwritten=0 applied=9241 ignored=0} inserts{seen=461900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20260502 14:06:43.066155 16582 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Bootstrap replayed 3/3 log segments. Stats: ops{read=10660 overwritten=0 applied=10657 ignored=0} inserts{seen=532700 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20260502 14:06:43.066632 16582 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Bootstrap complete.
I20260502 14:06:43.071070 16582 ts_tablet_manager.cc:1403] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Time spent bootstrapping tablet: real 2.036s user 1.715s sys 0.307s
I20260502 14:06:43.072166 16582 raft_consensus.cc:359] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 6 FOLLOWER]: Replica starting. Triggering 3 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:43.072827 16582 raft_consensus.cc:740] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 6 FOLLOWER]: Becoming Follower/Learner. State: Replica: 8b5f3bbadc484c728f336c95a8d8fd78, State: Initialized, Role: FOLLOWER
I20260502 14:06:43.073045 16582 consensus_queue.cc:260] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 10657, Last appended: 5.10660, Last appended by leader: 10660, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:43.073305 16582 ts_tablet_manager.cc:1434] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Time spent starting tablet: real 0.002s user 0.005s sys 0.000s
W20260502 14:06:43.102177 14695 scanner-internal.cc:458] Time spent opening tablet: real 2.405s user 0.001s sys 0.001s
I20260502 14:06:43.306152 16998 raft_consensus.cc:493] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 6 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260502 14:06:43.306264 16998 raft_consensus.cc:515] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 6 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:43.306533 16998 leader_election.cc:290] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [CANDIDATE]: Term 7 pre-election: Requested pre-vote from peers c9dd32be290b436fb7f776b6f481451b (127.12.158.65:34909), fcb69d1fc3094c95bd74e18f784e388d (127.12.158.67:44861)
I20260502 14:06:43.315698 16912 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" candidate_term: 7 candidate_status { last_received { term: 5 index: 10660 } } ignore_live_leader: false dest_uuid: "fcb69d1fc3094c95bd74e18f784e388d" is_pre_election: true
W20260502 14:06:43.316864 16577 leader_election.cc:343] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [CANDIDATE]: Term 7 pre-election: Tablet error from VoteRequest() call to peer fcb69d1fc3094c95bd74e18f784e388d (127.12.158.67:44861): Illegal state: must be running to vote when last-logged opid is not known
I20260502 14:06:43.327231 16778 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" candidate_term: 7 candidate_status { last_received { term: 5 index: 10660 } } ignore_live_leader: false dest_uuid: "c9dd32be290b436fb7f776b6f481451b" is_pre_election: true
W20260502 14:06:43.328390 16576 leader_election.cc:343] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [CANDIDATE]: Term 7 pre-election: Tablet error from VoteRequest() call to peer c9dd32be290b436fb7f776b6f481451b (127.12.158.65:34909): Illegal state: must be running to vote when last-logged opid is not known
I20260502 14:06:43.328485 16576 leader_election.cc:304] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [CANDIDATE]: Term 7 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 8b5f3bbadc484c728f336c95a8d8fd78; no voters: c9dd32be290b436fb7f776b6f481451b, fcb69d1fc3094c95bd74e18f784e388d
I20260502 14:06:43.328619 16998 raft_consensus.cc:2749] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 6 FOLLOWER]: Leader pre-election lost for term 7. Reason: could not achieve majority
I20260502 14:06:43.650239 16998 raft_consensus.cc:493] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 6 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260502 14:06:43.650357 16998 raft_consensus.cc:515] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 6 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:43.650511 16998 leader_election.cc:290] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [CANDIDATE]: Term 7 pre-election: Requested pre-vote from peers c9dd32be290b436fb7f776b6f481451b (127.12.158.65:34909), fcb69d1fc3094c95bd74e18f784e388d (127.12.158.67:44861)
I20260502 14:06:43.650663 16778 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" candidate_term: 7 candidate_status { last_received { term: 5 index: 10660 } } ignore_live_leader: false dest_uuid: "c9dd32be290b436fb7f776b6f481451b" is_pre_election: true
I20260502 14:06:43.650715 16912 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" candidate_term: 7 candidate_status { last_received { term: 5 index: 10660 } } ignore_live_leader: false dest_uuid: "fcb69d1fc3094c95bd74e18f784e388d" is_pre_election: true
W20260502 14:06:43.650852 16576 leader_election.cc:343] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [CANDIDATE]: Term 7 pre-election: Tablet error from VoteRequest() call to peer c9dd32be290b436fb7f776b6f481451b (127.12.158.65:34909): Illegal state: must be running to vote when last-logged opid is not known
W20260502 14:06:43.650902 16577 leader_election.cc:343] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [CANDIDATE]: Term 7 pre-election: Tablet error from VoteRequest() call to peer fcb69d1fc3094c95bd74e18f784e388d (127.12.158.67:44861): Illegal state: must be running to vote when last-logged opid is not known
I20260502 14:06:43.650935 16577 leader_election.cc:304] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [CANDIDATE]: Term 7 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 8b5f3bbadc484c728f336c95a8d8fd78; no voters: c9dd32be290b436fb7f776b6f481451b, fcb69d1fc3094c95bd74e18f784e388d
I20260502 14:06:43.651023 16998 raft_consensus.cc:2749] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 6 FOLLOWER]: Leader pre-election lost for term 7. Reason: could not achieve majority
I20260502 14:06:43.744976 16716 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Bootstrap replayed 2/3 log segments. Stats: ops{read=9399 overwritten=0 applied=9397 ignored=0} inserts{seen=469700 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20260502 14:06:43.960206 16850 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Bootstrap replayed 2/3 log segments. Stats: ops{read=9375 overwritten=0 applied=9372 ignored=0} inserts{seen=468450 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20260502 14:06:44.098685 16716 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Bootstrap replayed 3/3 log segments. Stats: ops{read=10657 overwritten=0 applied=10657 ignored=0} inserts{seen=532700 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20260502 14:06:44.099251 16716 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Bootstrap complete.
I20260502 14:06:44.105369 16716 ts_tablet_manager.cc:1403] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Time spent bootstrapping tablet: real 2.924s user 2.502s sys 0.411s
I20260502 14:06:44.106096 16716 raft_consensus.cc:359] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 6 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:44.106433 16716 raft_consensus.cc:740] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 6 FOLLOWER]: Becoming Follower/Learner. State: Replica: c9dd32be290b436fb7f776b6f481451b, State: Initialized, Role: FOLLOWER
I20260502 14:06:44.106577 16716 consensus_queue.cc:260] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 10657, Last appended: 5.10657, Last appended by leader: 10657, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:44.106832 16716 ts_tablet_manager.cc:1434] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Time spent starting tablet: real 0.001s user 0.001s sys 0.000s
I20260502 14:06:44.127040 16998 raft_consensus.cc:493] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 6 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260502 14:06:44.127142 16998 raft_consensus.cc:515] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 6 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:44.127331 16998 leader_election.cc:290] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [CANDIDATE]: Term 7 pre-election: Requested pre-vote from peers c9dd32be290b436fb7f776b6f481451b (127.12.158.65:34909), fcb69d1fc3094c95bd74e18f784e388d (127.12.158.67:44861)
I20260502 14:06:44.127487 16778 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" candidate_term: 7 candidate_status { last_received { term: 5 index: 10660 } } ignore_live_leader: false dest_uuid: "c9dd32be290b436fb7f776b6f481451b" is_pre_election: true
I20260502 14:06:44.127583 16912 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" candidate_term: 7 candidate_status { last_received { term: 5 index: 10660 } } ignore_live_leader: false dest_uuid: "fcb69d1fc3094c95bd74e18f784e388d" is_pre_election: true
I20260502 14:06:44.127617 16778 raft_consensus.cc:2468] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 6 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 8b5f3bbadc484c728f336c95a8d8fd78 in term 6.
I20260502 14:06:44.127792 16576 leader_election.cc:304] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [CANDIDATE]: Term 7 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 8b5f3bbadc484c728f336c95a8d8fd78, c9dd32be290b436fb7f776b6f481451b; no voters:
W20260502 14:06:44.127839 16577 leader_election.cc:343] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [CANDIDATE]: Term 7 pre-election: Tablet error from VoteRequest() call to peer fcb69d1fc3094c95bd74e18f784e388d (127.12.158.67:44861): Illegal state: must be running to vote when last-logged opid is not known
I20260502 14:06:44.127964 16998 raft_consensus.cc:2804] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 6 FOLLOWER]: Leader pre-election won for term 7
I20260502 14:06:44.128000 16998 raft_consensus.cc:493] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 6 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260502 14:06:44.128019 16998 raft_consensus.cc:3060] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 6 FOLLOWER]: Advancing to term 7
I20260502 14:06:44.129444 16998 raft_consensus.cc:515] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 7 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:44.129573 16998 leader_election.cc:290] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [CANDIDATE]: Term 7 election: Requested vote from peers c9dd32be290b436fb7f776b6f481451b (127.12.158.65:34909), fcb69d1fc3094c95bd74e18f784e388d (127.12.158.67:44861)
I20260502 14:06:44.129740 16912 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" candidate_term: 7 candidate_status { last_received { term: 5 index: 10660 } } ignore_live_leader: false dest_uuid: "fcb69d1fc3094c95bd74e18f784e388d"
I20260502 14:06:44.129739 16778 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" candidate_term: 7 candidate_status { last_received { term: 5 index: 10660 } } ignore_live_leader: false dest_uuid: "c9dd32be290b436fb7f776b6f481451b"
I20260502 14:06:44.129827 16778 raft_consensus.cc:3060] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 6 FOLLOWER]: Advancing to term 7
W20260502 14:06:44.130049 16577 leader_election.cc:343] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [CANDIDATE]: Term 7 election: Tablet error from VoteRequest() call to peer fcb69d1fc3094c95bd74e18f784e388d (127.12.158.67:44861): Illegal state: must be running to vote when last-logged opid is not known
I20260502 14:06:44.131028 16778 raft_consensus.cc:2468] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 7 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 8b5f3bbadc484c728f336c95a8d8fd78 in term 7.
I20260502 14:06:44.131186 16576 leader_election.cc:304] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [CANDIDATE]: Term 7 election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 8b5f3bbadc484c728f336c95a8d8fd78, c9dd32be290b436fb7f776b6f481451b; no voters: fcb69d1fc3094c95bd74e18f784e388d
I20260502 14:06:44.131310 16998 raft_consensus.cc:2804] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 7 FOLLOWER]: Leader election won for term 7
I20260502 14:06:44.131423 16998 raft_consensus.cc:697] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 7 LEADER]: Becoming Leader. State: Replica: 8b5f3bbadc484c728f336c95a8d8fd78, State: Running, Role: LEADER
I20260502 14:06:44.131516 16998 consensus_queue.cc:237] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 10657, Committed index: 10657, Last appended: 5.10660, Last appended by leader: 10660, Current term: 7, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:44.132170 14069 catalog_manager.cc:5671] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 reported cstate change: term changed from 5 to 7. New cstate: current_term: 7 leader_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } health_report { overall_health: UNKNOWN } } }
W20260502 14:06:44.209599 16577 consensus_peers.cc:597] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 -> Peer fcb69d1fc3094c95bd74e18f784e388d (127.12.158.67:44861): Couldn't send request to peer fcb69d1fc3094c95bd74e18f784e388d. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: BOOTSTRAPPING. This is attempt 1: this message will repeat every 5th retry.
I20260502 14:06:44.243146 16778 raft_consensus.cc:1275] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 7 FOLLOWER]: Refusing update from remote peer 8b5f3bbadc484c728f336c95a8d8fd78: Log matching property violated. Preceding OpId in replica: term: 5 index: 10657. Preceding OpId from leader: term: 7 index: 10661. (index mismatch)
I20260502 14:06:44.243677 17007 consensus_queue.cc:1048] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [LEADER]: Connected to new peer: Peer: permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 10661, Last known committed idx: 10657, Time since last communication: 0.000s
I20260502 14:06:44.246984 17017 rpcz_store.cc:275] Call kudu.tserver.TabletServerService.Write from 127.0.0.1:37766 (ReqId={client: 9a92a56a48e84ce69da2010ab7b71142, seq_no=10655, attempt_no=62}) took 1152 ms. Trace:
I20260502 14:06:44.247133 17017 rpcz_store.cc:276] 0502 14:06:43.094737 (+ 0us) service_pool.cc:167] Inserting onto call queue
0502 14:06:43.094763 (+ 26us) service_pool.cc:224] Handling call
0502 14:06:44.246977 (+1152214us) inbound_call.cc:177] Queueing success response
Metrics: {}
I20260502 14:06:44.247327 17015 rpcz_store.cc:275] Call kudu.tserver.TabletServerService.Write from 127.0.0.1:37766 (ReqId={client: 9a92a56a48e84ce69da2010ab7b71142, seq_no=10656, attempt_no=63}) took 1098 ms. Trace:
I20260502 14:06:44.247345 17016 rpcz_store.cc:275] Call kudu.tserver.TabletServerService.Write from 127.0.0.1:37766 (ReqId={client: 9a92a56a48e84ce69da2010ab7b71142, seq_no=10654, attempt_no=59}) took 1110 ms. Trace:
I20260502 14:06:44.247673 17016 rpcz_store.cc:276] 0502 14:06:43.136410 (+ 0us) service_pool.cc:167] Inserting onto call queue
0502 14:06:43.136657 (+ 247us) service_pool.cc:224] Handling call
0502 14:06:44.247341 (+1110684us) inbound_call.cc:177] Queueing success response
Metrics: {}
I20260502 14:06:44.247867 17015 rpcz_store.cc:276] 0502 14:06:43.148466 (+ 0us) service_pool.cc:167] Inserting onto call queue
0502 14:06:43.148490 (+ 24us) service_pool.cc:224] Handling call
0502 14:06:44.247314 (+1098824us) inbound_call.cc:177] Queueing success response
Metrics: {}
I20260502 14:06:44.314848 16850 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Bootstrap replayed 3/3 log segments. Stats: ops{read=10658 overwritten=0 applied=10657 ignored=0} inserts{seen=532700 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20260502 14:06:44.315440 16850 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Bootstrap complete.
I20260502 14:06:44.321544 16850 ts_tablet_manager.cc:1403] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Time spent bootstrapping tablet: real 2.909s user 2.516s sys 0.363s
I20260502 14:06:44.322657 16850 raft_consensus.cc:359] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 6 FOLLOWER]: Replica starting. Triggering 1 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:44.323486 16850 raft_consensus.cc:740] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 6 FOLLOWER]: Becoming Follower/Learner. State: Replica: fcb69d1fc3094c95bd74e18f784e388d, State: Initialized, Role: FOLLOWER
I20260502 14:06:44.323639 16850 consensus_queue.cc:260] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 10657, Last appended: 5.10658, Last appended by leader: 10658, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:44.323879 16850 ts_tablet_manager.cc:1434] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Time spent starting tablet: real 0.002s user 0.003s sys 0.001s
W20260502 14:06:44.382424 14696 scanner-internal.cc:458] Time spent opening tablet: real 4.006s user 0.001s sys 0.001s
I20260502 14:06:44.398604 16912 raft_consensus.cc:3060] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 6 FOLLOWER]: Advancing to term 7
I20260502 14:06:44.400465 16912 raft_consensus.cc:1275] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 7 FOLLOWER]: Refusing update from remote peer 8b5f3bbadc484c728f336c95a8d8fd78: Log matching property violated. Preceding OpId in replica: term: 5 index: 10658. Preceding OpId from leader: term: 5 index: 10660. (index mismatch)
I20260502 14:06:44.400995 17027 consensus_queue.cc:1050] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [LEADER]: Got LMP mismatch error from peer: Peer: permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 10661, Last known committed idx: 10657, Time since last communication: 0.000s
I20260502 14:06:44.418231 17025 mvcc.cc:204] Tried to move back new op lower bound from 7281585374820880384 to 7281585373723021312. Current Snapshot: MvccSnapshot[applied={T|T < 7281585374490927104 or (T in {7281585374491381760,7281585374501195776,7281585374501904384,7281585374502518784,7281585374513250304,7281585374508531712,7281585374514827264,7281585374522241024,7281585374519357440,7281585374522662912,7281585374529249280,7281585374530105344,7281585374539374592,7281585374530551808,7281585374540242944,7281585374546866176,7281585374540845056,7281585374549618688,7281585374551711744,7281585374553718784,7281585374561304576,7281585374558556160,7281585374562013184,7281585374567596032,7281585374565752832,7281585374574047232,7281585374575861760,7281585374574768128,7281585374583222272,7281585374585352192,7281585374586327040,7281585374592851968,7281585374593523712,7281585374593949696,7281585374602264576,7281585374599704576,7281585374604632064,7281585374610276352,7281585374613991424,7281585374619914240,7281585374615183360,7281585374623444992,7281585374630060032,7281585374626205696,7281585374630998016,7281585374637527040,7281585374634180608})}]
W20260502 14:06:44.453068 14697 scanner-internal.cc:458] Time spent opening tablet: real 4.006s user 0.001s sys 0.001s
I20260502 14:06:46.892979 16624 tablet_service.cc:1467] Tablet server has 1 leaders and 3 scanners
I20260502 14:06:46.907272 16758 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260502 14:06:46.908046 16892 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260502 14:06:46.917641 16490 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260502 14:06:47.208850 14069 ts_manager.cc:284] Unset tserver state for c9dd32be290b436fb7f776b6f481451b from MAINTENANCE_MODE
I20260502 14:06:47.226502 14069 ts_manager.cc:284] Unset tserver state for 8b5f3bbadc484c728f336c95a8d8fd78 from MAINTENANCE_MODE
I20260502 14:06:47.249498 16824 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
I20260502 14:06:47.340968 14069 ts_manager.cc:284] Unset tserver state for fcb69d1fc3094c95bd74e18f784e388d from MAINTENANCE_MODE
I20260502 14:06:47.354862 14069 ts_manager.cc:284] Unset tserver state for bd0a653794c34d9591e2d5c89c802493 from MAINTENANCE_MODE
I20260502 14:06:47.410073 16958 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
I20260502 14:06:47.413044 16690 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
I20260502 14:06:47.707887 14069 ts_manager.cc:295] Set tserver state for fcb69d1fc3094c95bd74e18f784e388d to MAINTENANCE_MODE
I20260502 14:06:47.735056 14069 ts_manager.cc:295] Set tserver state for 8b5f3bbadc484c728f336c95a8d8fd78 to MAINTENANCE_MODE
I20260502 14:06:47.747011 14069 ts_manager.cc:295] Set tserver state for c9dd32be290b436fb7f776b6f481451b to MAINTENANCE_MODE
I20260502 14:06:47.762818 14069 ts_manager.cc:295] Set tserver state for bd0a653794c34d9591e2d5c89c802493 to MAINTENANCE_MODE
I20260502 14:06:47.925839 16556 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
I20260502 14:06:48.117585 16490 tablet_service.cc:1460] Tablet server bd0a653794c34d9591e2d5c89c802493 set to quiescing
I20260502 14:06:48.117654 16490 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260502 14:06:48.211619 16624 tablet_service.cc:1460] Tablet server 8b5f3bbadc484c728f336c95a8d8fd78 set to quiescing
I20260502 14:06:48.211687 16624 tablet_service.cc:1467] Tablet server has 1 leaders and 2 scanners
I20260502 14:06:48.213567 17023 raft_consensus.cc:993] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: : Instructing follower fcb69d1fc3094c95bd74e18f784e388d to start an election
I20260502 14:06:48.213639 17023 raft_consensus.cc:1081] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 7 LEADER]: Signalling peer fcb69d1fc3094c95bd74e18f784e388d to start an election
I20260502 14:06:48.215358 16911 tablet_service.cc:2044] Received Run Leader Election RPC: tablet_id: "4710965d239c4545af4272cddced8dfb"
dest_uuid: "fcb69d1fc3094c95bd74e18f784e388d"
from {username='slave'} at 127.12.158.66:57373
I20260502 14:06:48.215456 16911 raft_consensus.cc:493] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 7 FOLLOWER]: Starting forced leader election (received explicit request)
I20260502 14:06:48.215514 16911 raft_consensus.cc:3060] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 7 FOLLOWER]: Advancing to term 8
I20260502 14:06:48.216382 16911 raft_consensus.cc:515] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 8 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:48.216635 16911 leader_election.cc:290] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [CANDIDATE]: Term 8 election: Requested vote from peers 8b5f3bbadc484c728f336c95a8d8fd78 (127.12.158.66:42349), c9dd32be290b436fb7f776b6f481451b (127.12.158.65:34909)
I20260502 14:06:48.216739 17080 raft_consensus.cc:993] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: : Instructing follower c9dd32be290b436fb7f776b6f481451b to start an election
I20260502 14:06:48.216778 17080 raft_consensus.cc:1081] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 7 LEADER]: Signalling peer c9dd32be290b436fb7f776b6f481451b to start an election
I20260502 14:06:48.217173 16912 raft_consensus.cc:1240] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 8 FOLLOWER]: Rejecting Update request from peer 8b5f3bbadc484c728f336c95a8d8fd78 for earlier term 7. Current term is 8. Ops: []
I20260502 14:06:48.217558 17023 consensus_queue.cc:1059] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [LEADER]: Peer responded invalid term: Peer: permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 }, Status: INVALID_TERM, Last received: 7.14059, Next index: 14060, Last known committed idx: 14056, Time since last communication: 0.000s
I20260502 14:06:48.217995 17023 raft_consensus.cc:3055] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 7 LEADER]: Stepping down as leader of term 7
I20260502 14:06:48.218037 17023 raft_consensus.cc:740] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 7 LEADER]: Becoming Follower/Learner. State: Replica: 8b5f3bbadc484c728f336c95a8d8fd78, State: Running, Role: LEADER
I20260502 14:06:48.218084 17023 consensus_queue.cc:260] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 14060, Committed index: 14060, Last appended: 7.14061, Last appended by leader: 14061, Current term: 7, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:48.218149 17023 raft_consensus.cc:3060] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 7 FOLLOWER]: Advancing to term 8
W20260502 14:06:48.219180 16603 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37766: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:48.219192 17018 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37766: Illegal state: Cannot assign timestamp to op. Tablet is not in leader mode. Last heard from a leader: 0.002s ago.
I20260502 14:06:48.219545 16912 raft_consensus.cc:1240] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 8 FOLLOWER]: Rejecting Update request from peer 8b5f3bbadc484c728f336c95a8d8fd78 for earlier term 7. Current term is 8. Ops: [7.14060-7.14061]
I20260502 14:06:48.219856 16778 tablet_service.cc:2044] Received Run Leader Election RPC: tablet_id: "4710965d239c4545af4272cddced8dfb"
dest_uuid: "c9dd32be290b436fb7f776b6f481451b"
from {username='slave'} at 127.12.158.66:44183
I20260502 14:06:48.219947 16778 raft_consensus.cc:493] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 7 FOLLOWER]: Starting forced leader election (received explicit request)
I20260502 14:06:48.219980 16778 raft_consensus.cc:3060] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 7 FOLLOWER]: Advancing to term 8
I20260502 14:06:48.220791 16778 raft_consensus.cc:515] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 8 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:48.220929 16892 tablet_service.cc:1460] Tablet server fcb69d1fc3094c95bd74e18f784e388d set to quiescing
I20260502 14:06:48.220973 16892 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260502 14:06:48.221006 16778 leader_election.cc:290] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [CANDIDATE]: Term 8 election: Requested vote from peers 8b5f3bbadc484c728f336c95a8d8fd78 (127.12.158.66:42349), fcb69d1fc3094c95bd74e18f784e388d (127.12.158.67:44861)
W20260502 14:06:48.224752 16738 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37274: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:48.225126 16738 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37274: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
I20260502 14:06:48.226477 16644 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "fcb69d1fc3094c95bd74e18f784e388d" candidate_term: 8 candidate_status { last_received { term: 7 index: 14059 } } ignore_live_leader: true dest_uuid: "8b5f3bbadc484c728f336c95a8d8fd78"
I20260502 14:06:48.226585 16644 raft_consensus.cc:2410] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 8 FOLLOWER]: Leader election vote request: Denying vote to candidate fcb69d1fc3094c95bd74e18f784e388d for term 8 because replica has last-logged OpId of term: 7 index: 14061, which is greater than that of the candidate, which has last-logged OpId of term: 7 index: 14059.
I20260502 14:06:48.227502 16778 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "fcb69d1fc3094c95bd74e18f784e388d" candidate_term: 8 candidate_status { last_received { term: 7 index: 14059 } } ignore_live_leader: true dest_uuid: "c9dd32be290b436fb7f776b6f481451b"
I20260502 14:06:48.227591 16778 raft_consensus.cc:2393] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 8 FOLLOWER]: Leader election vote request: Denying vote to candidate fcb69d1fc3094c95bd74e18f784e388d in current term 8: Already voted for candidate c9dd32be290b436fb7f776b6f481451b in this term.
I20260502 14:06:48.228514 16844 leader_election.cc:304] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [CANDIDATE]: Term 8 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: fcb69d1fc3094c95bd74e18f784e388d; no voters: 8b5f3bbadc484c728f336c95a8d8fd78, c9dd32be290b436fb7f776b6f481451b
W20260502 14:06:48.228675 16871 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47658: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:48.229650 16871 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47658: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
I20260502 14:06:48.229882 16644 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "c9dd32be290b436fb7f776b6f481451b" candidate_term: 8 candidate_status { last_received { term: 7 index: 14060 } } ignore_live_leader: true dest_uuid: "8b5f3bbadc484c728f336c95a8d8fd78"
I20260502 14:06:48.229966 16644 raft_consensus.cc:2410] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 8 FOLLOWER]: Leader election vote request: Denying vote to candidate c9dd32be290b436fb7f776b6f481451b for term 8 because replica has last-logged OpId of term: 7 index: 14061, which is greater than that of the candidate, which has last-logged OpId of term: 7 index: 14060.
I20260502 14:06:48.230664 16912 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "c9dd32be290b436fb7f776b6f481451b" candidate_term: 8 candidate_status { last_received { term: 7 index: 14060 } } ignore_live_leader: true dest_uuid: "fcb69d1fc3094c95bd74e18f784e388d"
I20260502 14:06:48.230749 16912 raft_consensus.cc:2393] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 8 FOLLOWER]: Leader election vote request: Denying vote to candidate c9dd32be290b436fb7f776b6f481451b in current term 8: Already voted for candidate fcb69d1fc3094c95bd74e18f784e388d in this term.
I20260502 14:06:48.230957 16711 leader_election.cc:304] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [CANDIDATE]: Term 8 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: c9dd32be290b436fb7f776b6f481451b; no voters: 8b5f3bbadc484c728f336c95a8d8fd78, fcb69d1fc3094c95bd74e18f784e388d
I20260502 14:06:48.231200 17215 raft_consensus.cc:2749] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 8 FOLLOWER]: Leader election lost for term 8. Reason: could not achieve majority
W20260502 14:06:48.234673 16603 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37766: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:48.234813 16600 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37766: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
I20260502 14:06:48.235917 17214 raft_consensus.cc:2749] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 8 FOLLOWER]: Leader election lost for term 8. Reason: could not achieve majority
W20260502 14:06:48.240372 16738 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37274: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:48.242300 16738 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37274: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
I20260502 14:06:48.243152 16758 tablet_service.cc:1460] Tablet server c9dd32be290b436fb7f776b6f481451b set to quiescing
I20260502 14:06:48.243204 16758 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20260502 14:06:48.248803 16871 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47658: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:48.248961 16870 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47658: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
I20260502 14:06:48.250715 16824 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
W20260502 14:06:48.256773 16601 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37766: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:48.258728 16600 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37766: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:48.266182 16738 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37274: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:48.267345 16738 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37274: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:48.276029 16870 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47658: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:48.277020 16870 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47658: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:48.287209 16600 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37766: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:48.289151 16600 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37766: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:48.302848 16738 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37274: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:48.303901 16738 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37274: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:48.315518 16870 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47658: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:48.318553 16870 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47658: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:48.329679 16600 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37766: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:48.334769 16600 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37766: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:48.344841 16738 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37274: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:48.350003 16738 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37274: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:48.361665 16870 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47658: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:48.368753 16870 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47658: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:48.377823 16600 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37766: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:48.384827 16600 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37766: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:48.397500 16738 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37274: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:48.405668 16738 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37274: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:48.420063 16870 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47658: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:48.424525 16870 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47658: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:48.439671 16600 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37766: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:48.446969 16600 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37766: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:48.461658 16738 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37274: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:48.467801 16738 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37274: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:48.482656 16870 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47658: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:48.488646 16870 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47658: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:48.509896 16600 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37766: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:48.510804 16600 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37766: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:48.535584 16738 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37274: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:48.536495 16738 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37274: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:48.560151 16870 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47658: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:48.560151 16871 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47658: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:48.573311 17023 raft_consensus.cc:670] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: failed to trigger leader election: Illegal state: leader elections are disabled
W20260502 14:06:48.582989 17215 raft_consensus.cc:670] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: failed to trigger leader election: Illegal state: leader elections are disabled
W20260502 14:06:48.585352 16600 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37766: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:48.588457 16600 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37766: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:48.613304 16738 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37274: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:48.616248 16738 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37274: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:48.641973 17214 raft_consensus.cc:670] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: failed to trigger leader election: Illegal state: leader elections are disabled
W20260502 14:06:48.642894 16871 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47658: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:48.646097 16871 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47658: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:48.675704 16600 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37766: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:48.677973 16600 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37766: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:48.705838 16738 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37274: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:48.706812 16738 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37274: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:48.737495 16871 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47658: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:48.739646 16871 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47658: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:48.770057 16600 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37766: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:48.772060 16600 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37766: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:48.803895 16738 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37274: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:48.804760 16738 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37274: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:48.837411 16870 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47658: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:48.839468 16870 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47658: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:48.873848 16600 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37766: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:48.876937 16600 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37766: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:48.909741 16738 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37274: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:48.913813 16738 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37274: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:48.949505 16870 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47658: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:48.951550 16870 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47658: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:48.990983 16600 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37766: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:48.991819 16600 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37766: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:49.031733 16737 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37274: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:49.031725 16738 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37274: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:49.072708 16870 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47658: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:49.074640 16870 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47658: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:49.117031 16601 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37766: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:49.117031 16600 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37766: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:49.158919 16738 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37274: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:49.159905 16738 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37274: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:49.202653 16870 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47658: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:49.205724 16870 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47658: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:49.249127 16600 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37766: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:49.250108 16600 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37766: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:49.294123 16738 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37274: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:49.297150 16738 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37274: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:49.341964 16870 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47658: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:49.341964 16871 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47658: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
I20260502 14:06:49.357345 16624 tablet_service.cc:1460] Tablet server 8b5f3bbadc484c728f336c95a8d8fd78 set to quiescing
I20260502 14:06:49.357405 16624 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20260502 14:06:49.391791 16600 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37766: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:49.392263 16600 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37766: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
I20260502 14:06:49.413930 12921 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskW6FTeC/build/release/bin/kudu with pid 16428
I20260502 14:06:49.419909 12921 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskW6FTeC/build/release/bin/kudu
/tmp/dist-test-taskW6FTeC/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.12.158.68:37147
--local_ip_for_outbound_sockets=127.12.158.68
--tserver_master_addrs=127.12.158.126:36477
--webserver_port=44081
--webserver_interface=127.12.158.68
--builtin_ntp_servers=127.12.158.84:38351
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20260502 14:06:49.440181 16738 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37274: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:49.443198 16738 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37274: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:49.490202 16871 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47658: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:49.492169 16871 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47658: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:49.498160 17233 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260502 14:06:49.498312 17233 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260502 14:06:49.498332 17233 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260502 14:06:49.499868 17233 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260502 14:06:49.499919 17233 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.12.158.68
I20260502 14:06:49.501470 17233 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.12.158.84:38351
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.12.158.68:37147
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.12.158.68
--webserver_port=44081
--tserver_master_addrs=127.12.158.126:36477
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.17233
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.12.158.68
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type RELEASE
built by None at 02 May 2026 13:43:15 UTC on e7f111948823
build id 11709
I20260502 14:06:49.501688 17233 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260502 14:06:49.501932 17233 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260502 14:06:49.502593 17233 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260502 14:06:49.504742 17239 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:49.504769 17238 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:49.504899 17241 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260502 14:06:49.504871 17233 server_base.cc:1061] running on GCE node
I20260502 14:06:49.505203 17233 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260502 14:06:49.505419 17233 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260502 14:06:49.506549 17233 hybrid_clock.cc:648] HybridClock initialized: now 1777730809506526 us; error 43 us; skew 500 ppm
I20260502 14:06:49.507658 17233 webserver.cc:492] Webserver started at http://127.12.158.68:44081/ using document root <none> and password file <none>
I20260502 14:06:49.507901 17233 fs_manager.cc:362] Metadata directory not provided
I20260502 14:06:49.507951 17233 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260502 14:06:49.509104 17233 fs_manager.cc:714] Time spent opening directory manager: real 0.001s user 0.001s sys 0.000s
I20260502 14:06:49.509711 17247 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:49.509877 17233 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.001s sys 0.000s
I20260502 14:06:49.510044 17233 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/data,/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/wal
uuid: "bd0a653794c34d9591e2d5c89c802493"
format_stamp: "Formatted at 2026-05-02 14:06:22 on dist-test-slave-23m0"
I20260502 14:06:49.510438 17233 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260502 14:06:49.527509 17233 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260502 14:06:49.527840 17233 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260502 14:06:49.527992 17233 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260502 14:06:49.528209 17233 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260502 14:06:49.528537 17233 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260502 14:06:49.528591 17233 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:49.528636 17233 ts_tablet_manager.cc:616] Registered 0 tablets
I20260502 14:06:49.528667 17233 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:49.534375 17233 rpc_server.cc:307] RPC server started. Bound to: 127.12.158.68:37147
I20260502 14:06:49.534403 17360 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.12.158.68:37147 every 8 connection(s)
I20260502 14:06:49.534737 17233 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/data/info.pb
I20260502 14:06:49.539459 17361 heartbeater.cc:344] Connected to a master server at 127.12.158.126:36477
I20260502 14:06:49.539542 17361 heartbeater.cc:461] Registering TS with master...
I20260502 14:06:49.539724 17361 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
I20260502 14:06:49.540158 14069 ts_manager.cc:194] Re-registered known tserver with Master: bd0a653794c34d9591e2d5c89c802493 (127.12.158.68:37147)
I20260502 14:06:49.540586 14069 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.12.158.68:42957
W20260502 14:06:49.541694 16600 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37766: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:49.542320 16600 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37766: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
I20260502 14:06:49.544597 12921 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskW6FTeC/build/release/bin/kudu as pid 17233
I20260502 14:06:49.544687 12921 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskW6FTeC/build/release/bin/kudu with pid 16560
W20260502 14:06:49.555990 14667 meta_cache.cc:302] tablet 4710965d239c4545af4272cddced8dfb: replica 8b5f3bbadc484c728f336c95a8d8fd78 (127.12.158.66:42349) has failed: Network error: recv got EOF from 127.12.158.66:42349 (error 108)
I20260502 14:06:49.556344 12921 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskW6FTeC/build/release/bin/kudu
/tmp/dist-test-taskW6FTeC/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.12.158.66:42349
--local_ip_for_outbound_sockets=127.12.158.66
--tserver_master_addrs=127.12.158.126:36477
--webserver_port=33233
--webserver_interface=127.12.158.66
--builtin_ntp_servers=127.12.158.84:38351
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20260502 14:06:49.560782 16738 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37274: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:49.563555 16871 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47658: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:49.575343 16738 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37274: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:49.584048 16871 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47658: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:49.595618 16738 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37274: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:49.596710 16738 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37274: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:49.603866 16738 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37274: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:49.615962 16871 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47658: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
I20260502 14:06:49.626915 14697 meta_cache.cc:1510] marking tablet server 8b5f3bbadc484c728f336c95a8d8fd78 (127.12.158.66:42349) as failed
W20260502 14:06:49.636243 17365 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260502 14:06:49.636415 17365 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260502 14:06:49.636435 17365 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260502 14:06:49.637921 17365 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260502 14:06:49.637972 17365 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.12.158.66
I20260502 14:06:49.639612 17365 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.12.158.84:38351
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.12.158.66:42349
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.12.158.66
--webserver_port=33233
--tserver_master_addrs=127.12.158.126:36477
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.17365
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.12.158.66
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type RELEASE
built by None at 02 May 2026 13:43:15 UTC on e7f111948823
build id 11709
I20260502 14:06:49.639840 17365 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260502 14:06:49.640053 17365 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260502 14:06:49.640691 17365 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260502 14:06:49.642510 17374 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:49.642514 17372 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:49.642514 17371 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260502 14:06:49.642879 17365 server_base.cc:1061] running on GCE node
I20260502 14:06:49.643043 17365 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260502 14:06:49.643267 17365 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260502 14:06:49.644413 17365 hybrid_clock.cc:648] HybridClock initialized: now 1777730809644394 us; error 25 us; skew 500 ppm
I20260502 14:06:49.645494 17365 webserver.cc:492] Webserver started at http://127.12.158.66:33233/ using document root <none> and password file <none>
I20260502 14:06:49.645705 17365 fs_manager.cc:362] Metadata directory not provided
I20260502 14:06:49.645774 17365 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260502 14:06:49.646929 17365 fs_manager.cc:714] Time spent opening directory manager: real 0.001s user 0.001s sys 0.000s
I20260502 14:06:49.647667 17380 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:49.647886 17365 fs_manager.cc:730] Time spent opening block manager: real 0.001s user 0.001s sys 0.000s
I20260502 14:06:49.647962 17365 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/data,/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/wal
uuid: "8b5f3bbadc484c728f336c95a8d8fd78"
format_stamp: "Formatted at 2026-05-02 14:06:22 on dist-test-slave-23m0"
I20260502 14:06:49.648224 17365 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
W20260502 14:06:49.649827 16870 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47658: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:49.649830 16871 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47658: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:49.654673 16871 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47658: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
I20260502 14:06:49.666406 17365 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260502 14:06:49.666743 17365 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260502 14:06:49.666884 17365 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260502 14:06:49.667109 17365 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
W20260502 14:06:49.667552 16738 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37274: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
I20260502 14:06:49.667716 17387 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20260502 14:06:49.668493 17365 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20260502 14:06:49.668552 17365 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s user 0.000s sys 0.000s
I20260502 14:06:49.668588 17365 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20260502 14:06:49.669119 17365 ts_tablet_manager.cc:616] Registered 1 tablets
I20260502 14:06:49.669173 17365 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s user 0.000s sys 0.000s
I20260502 14:06:49.669221 17387 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Bootstrap starting.
I20260502 14:06:49.675700 17365 rpc_server.cc:307] RPC server started. Bound to: 127.12.158.66:42349
I20260502 14:06:49.675769 17494 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.12.158.66:42349 every 8 connection(s)
I20260502 14:06:49.676046 17365 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/data/info.pb
I20260502 14:06:49.680938 12921 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskW6FTeC/build/release/bin/kudu as pid 17365
I20260502 14:06:49.681049 12921 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskW6FTeC/build/release/bin/kudu with pid 16693
I20260502 14:06:49.684710 17495 heartbeater.cc:344] Connected to a master server at 127.12.158.126:36477
I20260502 14:06:49.684835 17495 heartbeater.cc:461] Registering TS with master...
I20260502 14:06:49.685030 17495 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
I20260502 14:06:49.685052 17387 log.cc:826] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Log is configured to *not* fsync() on all Append() calls
I20260502 14:06:49.685608 14069 ts_manager.cc:194] Re-registered known tserver with Master: 8b5f3bbadc484c728f336c95a8d8fd78 (127.12.158.66:42349)
I20260502 14:06:49.686000 14069 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.12.158.66:36581
I20260502 14:06:49.694537 12921 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskW6FTeC/build/release/bin/kudu
/tmp/dist-test-taskW6FTeC/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.12.158.65:34909
--local_ip_for_outbound_sockets=127.12.158.65
--tserver_master_addrs=127.12.158.126:36477
--webserver_port=41631
--webserver_interface=127.12.158.65
--builtin_ntp_servers=127.12.158.84:38351
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20260502 14:06:49.728973 16871 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47658: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:49.759131 16871 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47658: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:49.766294 16871 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47658: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:49.774238 17500 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260502 14:06:49.774415 17500 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260502 14:06:49.774451 17500 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260502 14:06:49.775985 17500 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260502 14:06:49.776058 17500 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.12.158.65
I20260502 14:06:49.777554 17500 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.12.158.84:38351
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.12.158.65:34909
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.12.158.65
--webserver_port=41631
--tserver_master_addrs=127.12.158.126:36477
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.17500
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.12.158.65
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type RELEASE
built by None at 02 May 2026 13:43:15 UTC on e7f111948823
build id 11709
I20260502 14:06:49.777783 17500 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260502 14:06:49.777998 17500 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260502 14:06:49.778697 17500 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260502 14:06:49.780539 17506 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:49.780543 17508 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:49.780565 17505 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260502 14:06:49.780974 17500 server_base.cc:1061] running on GCE node
I20260502 14:06:49.781138 17500 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260502 14:06:49.781370 17500 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260502 14:06:49.782516 17500 hybrid_clock.cc:648] HybridClock initialized: now 1777730809782494 us; error 35 us; skew 500 ppm
I20260502 14:06:49.783607 17500 webserver.cc:492] Webserver started at http://127.12.158.65:41631/ using document root <none> and password file <none>
I20260502 14:06:49.783872 17500 fs_manager.cc:362] Metadata directory not provided
I20260502 14:06:49.783939 17500 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260502 14:06:49.785079 17500 fs_manager.cc:714] Time spent opening directory manager: real 0.001s user 0.000s sys 0.001s
I20260502 14:06:49.785770 17514 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:49.786011 17500 fs_manager.cc:730] Time spent opening block manager: real 0.001s user 0.000s sys 0.001s
I20260502 14:06:49.786091 17500 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/data,/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/wal
uuid: "c9dd32be290b436fb7f776b6f481451b"
format_stamp: "Formatted at 2026-05-02 14:06:21 on dist-test-slave-23m0"
I20260502 14:06:49.786401 17500 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260502 14:06:49.808364 17500 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260502 14:06:49.808662 17500 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260502 14:06:49.808919 17500 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260502 14:06:49.809145 17500 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260502 14:06:49.809670 17521 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20260502 14:06:49.810470 17500 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20260502 14:06:49.810524 17500 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s user 0.000s sys 0.000s
I20260502 14:06:49.810563 17500 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20260502 14:06:49.811214 17500 ts_tablet_manager.cc:616] Registered 1 tablets
I20260502 14:06:49.811256 17500 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s user 0.000s sys 0.000s
I20260502 14:06:49.811308 17521 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Bootstrap starting.
W20260502 14:06:49.814352 16871 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47658: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
I20260502 14:06:49.818732 17500 rpc_server.cc:307] RPC server started. Bound to: 127.12.158.65:34909
I20260502 14:06:49.818863 17628 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.12.158.65:34909 every 8 connection(s)
I20260502 14:06:49.819099 17500 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/data/info.pb
I20260502 14:06:49.822131 17521 log.cc:826] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Log is configured to *not* fsync() on all Append() calls
I20260502 14:06:49.824419 17629 heartbeater.cc:344] Connected to a master server at 127.12.158.126:36477
I20260502 14:06:49.824502 17629 heartbeater.cc:461] Registering TS with master...
I20260502 14:06:49.824679 17629 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
I20260502 14:06:49.825184 14069 ts_manager.cc:194] Re-registered known tserver with Master: c9dd32be290b436fb7f776b6f481451b (127.12.158.65:34909)
I20260502 14:06:49.825696 14069 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.12.158.65:33395
I20260502 14:06:49.829162 12921 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskW6FTeC/build/release/bin/kudu as pid 17500
I20260502 14:06:49.829280 12921 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskW6FTeC/build/release/bin/kudu with pid 16827
I20260502 14:06:49.845911 12921 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskW6FTeC/build/release/bin/kudu
/tmp/dist-test-taskW6FTeC/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.12.158.67:44861
--local_ip_for_outbound_sockets=127.12.158.67
--tserver_master_addrs=127.12.158.126:36477
--webserver_port=37231
--webserver_interface=127.12.158.67
--builtin_ntp_servers=127.12.158.84:38351
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20260502 14:06:49.960572 17634 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260502 14:06:49.960793 17634 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260502 14:06:49.960824 17634 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260502 14:06:49.963172 17634 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260502 14:06:49.963248 17634 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.12.158.67
I20260502 14:06:49.965655 17634 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.12.158.84:38351
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.12.158.67:44861
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.12.158.67
--webserver_port=37231
--tserver_master_addrs=127.12.158.126:36477
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.17634
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.12.158.67
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type RELEASE
built by None at 02 May 2026 13:43:15 UTC on e7f111948823
build id 11709
I20260502 14:06:49.965880 17634 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260502 14:06:49.966116 17634 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260502 14:06:49.966948 17634 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260502 14:06:49.968879 17640 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:49.969101 17642 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:49.968991 17639 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260502 14:06:49.970016 17634 server_base.cc:1061] running on GCE node
I20260502 14:06:49.970185 17634 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260502 14:06:49.970398 17634 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260502 14:06:49.971567 17634 hybrid_clock.cc:648] HybridClock initialized: now 1777730809971544 us; error 30 us; skew 500 ppm
I20260502 14:06:49.972892 17634 webserver.cc:492] Webserver started at http://127.12.158.67:37231/ using document root <none> and password file <none>
I20260502 14:06:49.973099 17634 fs_manager.cc:362] Metadata directory not provided
I20260502 14:06:49.973162 17634 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260502 14:06:49.974645 17634 fs_manager.cc:714] Time spent opening directory manager: real 0.001s user 0.001s sys 0.000s
I20260502 14:06:49.975520 17648 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:49.975795 17634 fs_manager.cc:730] Time spent opening block manager: real 0.001s user 0.001s sys 0.000s
I20260502 14:06:49.975865 17634 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/data,/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/wal
uuid: "fcb69d1fc3094c95bd74e18f784e388d"
format_stamp: "Formatted at 2026-05-02 14:06:22 on dist-test-slave-23m0"
I20260502 14:06:49.976161 17634 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260502 14:06:50.013835 17634 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260502 14:06:50.014196 17634 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260502 14:06:50.014334 17634 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260502 14:06:50.014582 17634 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260502 14:06:50.015116 17655 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20260502 14:06:50.016196 17634 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20260502 14:06:50.016253 17634 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s user 0.000s sys 0.000s
I20260502 14:06:50.016286 17634 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20260502 14:06:50.017002 17634 ts_tablet_manager.cc:616] Registered 1 tablets
I20260502 14:06:50.017067 17634 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s user 0.000s sys 0.000s
I20260502 14:06:50.017199 17655 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Bootstrap starting.
I20260502 14:06:50.023816 17634 rpc_server.cc:307] RPC server started. Bound to: 127.12.158.67:44861
I20260502 14:06:50.024178 17634 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/data/info.pb
I20260502 14:06:50.026034 12921 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskW6FTeC/build/release/bin/kudu as pid 17634
I20260502 14:06:50.027602 17655 log.cc:826] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Log is configured to *not* fsync() on all Append() calls
I20260502 14:06:50.031890 17762 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.12.158.67:44861 every 8 connection(s)
I20260502 14:06:50.052117 17763 heartbeater.cc:344] Connected to a master server at 127.12.158.126:36477
I20260502 14:06:50.052227 17763 heartbeater.cc:461] Registering TS with master...
I20260502 14:06:50.052410 17763 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
I20260502 14:06:50.052891 14069 ts_manager.cc:194] Re-registered known tserver with Master: fcb69d1fc3094c95bd74e18f784e388d (127.12.158.67:44861)
I20260502 14:06:50.053369 14069 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.12.158.67:44683
I20260502 14:06:50.209792 17429 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260502 14:06:50.214124 17553 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260502 14:06:50.225374 17697 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260502 14:06:50.239204 17295 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260502 14:06:50.541553 17361 heartbeater.cc:499] Master 127.12.158.126:36477 was elected leader, sending a full tablet report...
I20260502 14:06:50.687054 17495 heartbeater.cc:499] Master 127.12.158.126:36477 was elected leader, sending a full tablet report...
I20260502 14:06:50.804118 17387 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Bootstrap replayed 1/4 log segments. Stats: ops{read=4622 overwritten=0 applied=4621 ignored=0} inserts{seen=230950 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20260502 14:06:50.826594 17629 heartbeater.cc:499] Master 127.12.158.126:36477 was elected leader, sending a full tablet report...
I20260502 14:06:50.921573 17521 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Bootstrap replayed 1/4 log segments. Stats: ops{read=4623 overwritten=0 applied=4621 ignored=0} inserts{seen=230950 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20260502 14:06:50.962667 17655 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Bootstrap replayed 1/4 log segments. Stats: ops{read=4622 overwritten=0 applied=4621 ignored=0} inserts{seen=230950 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20260502 14:06:51.054093 17763 heartbeater.cc:499] Master 127.12.158.126:36477 was elected leader, sending a full tablet report...
I20260502 14:06:51.707473 17521 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Bootstrap replayed 2/4 log segments. Stats: ops{read=9244 overwritten=0 applied=9242 ignored=0} inserts{seen=461950 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20260502 14:06:52.146271 17387 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Bootstrap replayed 2/4 log segments. Stats: ops{read=9244 overwritten=0 applied=9241 ignored=0} inserts{seen=461900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20260502 14:06:52.265789 17655 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Bootstrap replayed 2/4 log segments. Stats: ops{read=9244 overwritten=0 applied=9242 ignored=0} inserts{seen=461950 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20260502 14:06:52.552057 17521 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Bootstrap replayed 3/4 log segments. Stats: ops{read=14056 overwritten=0 applied=14053 ignored=0} inserts{seen=702450 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20260502 14:06:52.553440 17521 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Bootstrap replayed 4/4 log segments. Stats: ops{read=14060 overwritten=0 applied=14060 ignored=0} inserts{seen=702800 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20260502 14:06:52.553920 17521 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Bootstrap complete.
I20260502 14:06:52.560096 17521 ts_tablet_manager.cc:1403] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Time spent bootstrapping tablet: real 2.749s user 2.379s sys 0.356s
I20260502 14:06:52.561009 17521 raft_consensus.cc:359] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 8 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:52.561223 17521 raft_consensus.cc:740] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 8 FOLLOWER]: Becoming Follower/Learner. State: Replica: c9dd32be290b436fb7f776b6f481451b, State: Initialized, Role: FOLLOWER
I20260502 14:06:52.561375 17521 consensus_queue.cc:260] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 14060, Last appended: 7.14060, Last appended by leader: 14060, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:52.561642 17521 ts_tablet_manager.cc:1434] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Time spent starting tablet: real 0.001s user 0.002s sys 0.000s
W20260502 14:06:52.746537 17543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:52.802384 17543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:52.806339 17543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
I20260502 14:06:52.850276 17807 raft_consensus.cc:493] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 8 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260502 14:06:52.850481 17807 raft_consensus.cc:515] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 8 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:52.851675 17807 leader_election.cc:290] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [CANDIDATE]: Term 9 pre-election: Requested pre-vote from peers 8b5f3bbadc484c728f336c95a8d8fd78 (127.12.158.66:42349), fcb69d1fc3094c95bd74e18f784e388d (127.12.158.67:44861)
I20260502 14:06:52.855239 17449 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "c9dd32be290b436fb7f776b6f481451b" candidate_term: 9 candidate_status { last_received { term: 7 index: 14060 } } ignore_live_leader: false dest_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" is_pre_election: true
W20260502 14:06:52.856383 17516 leader_election.cc:343] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [CANDIDATE]: Term 9 pre-election: Tablet error from VoteRequest() call to peer 8b5f3bbadc484c728f336c95a8d8fd78 (127.12.158.66:42349): Illegal state: must be running to vote when last-logged opid is not known
I20260502 14:06:52.856096 17705 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "c9dd32be290b436fb7f776b6f481451b" candidate_term: 9 candidate_status { last_received { term: 7 index: 14060 } } ignore_live_leader: false dest_uuid: "fcb69d1fc3094c95bd74e18f784e388d" is_pre_election: true
W20260502 14:06:52.857033 17516 leader_election.cc:343] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [CANDIDATE]: Term 9 pre-election: Tablet error from VoteRequest() call to peer fcb69d1fc3094c95bd74e18f784e388d (127.12.158.67:44861): Illegal state: must be running to vote when last-logged opid is not known
I20260502 14:06:52.857095 17516 leader_election.cc:304] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [CANDIDATE]: Term 9 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: c9dd32be290b436fb7f776b6f481451b; no voters: 8b5f3bbadc484c728f336c95a8d8fd78, fcb69d1fc3094c95bd74e18f784e388d
I20260502 14:06:52.857196 17807 raft_consensus.cc:2749] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 8 FOLLOWER]: Leader pre-election lost for term 9. Reason: could not achieve majority
W20260502 14:06:52.989320 17543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
I20260502 14:06:53.153095 17387 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Bootstrap replayed 3/4 log segments. Stats: ops{read=13875 overwritten=0 applied=13874 ignored=0} inserts{seen=693500 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20260502 14:06:53.186939 17387 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Bootstrap replayed 4/4 log segments. Stats: ops{read=14061 overwritten=0 applied=14060 ignored=0} inserts{seen=702800 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20260502 14:06:53.187454 17387 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Bootstrap complete.
W20260502 14:06:53.189328 17543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
I20260502 14:06:53.193532 17387 ts_tablet_manager.cc:1403] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Time spent bootstrapping tablet: real 3.524s user 3.048s sys 0.417s
I20260502 14:06:53.194083 17387 raft_consensus.cc:359] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 8 FOLLOWER]: Replica starting. Triggering 1 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:53.194725 17387 raft_consensus.cc:740] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 8 FOLLOWER]: Becoming Follower/Learner. State: Replica: 8b5f3bbadc484c728f336c95a8d8fd78, State: Initialized, Role: FOLLOWER
I20260502 14:06:53.194857 17387 consensus_queue.cc:260] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 14060, Last appended: 7.14061, Last appended by leader: 14061, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:53.195099 17387 ts_tablet_manager.cc:1434] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Time spent starting tablet: real 0.002s user 0.003s sys 0.000s
W20260502 14:06:53.196285 17543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:53.244671 17543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
I20260502 14:06:53.266324 17655 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Bootstrap replayed 3/4 log segments. Stats: ops{read=14053 overwritten=0 applied=14050 ignored=0} inserts{seen=702300 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20260502 14:06:53.267534 17655 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Bootstrap replayed 4/4 log segments. Stats: ops{read=14059 overwritten=0 applied=14056 ignored=0} inserts{seen=702600 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20260502 14:06:53.268013 17655 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Bootstrap complete.
I20260502 14:06:53.273699 17655 ts_tablet_manager.cc:1403] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Time spent bootstrapping tablet: real 3.257s user 2.785s sys 0.458s
I20260502 14:06:53.274219 17655 raft_consensus.cc:359] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 8 FOLLOWER]: Replica starting. Triggering 3 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:53.274809 17655 raft_consensus.cc:740] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 8 FOLLOWER]: Becoming Follower/Learner. State: Replica: fcb69d1fc3094c95bd74e18f784e388d, State: Initialized, Role: FOLLOWER
I20260502 14:06:53.274968 17655 consensus_queue.cc:260] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 14056, Last appended: 7.14059, Last appended by leader: 14059, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:53.275199 17655 ts_tablet_manager.cc:1434] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Time spent starting tablet: real 0.001s user 0.003s sys 0.000s
I20260502 14:06:53.282191 17807 raft_consensus.cc:493] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 8 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260502 14:06:53.282259 17807 raft_consensus.cc:515] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 8 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:53.282441 17807 leader_election.cc:290] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [CANDIDATE]: Term 9 pre-election: Requested pre-vote from peers 8b5f3bbadc484c728f336c95a8d8fd78 (127.12.158.66:42349), fcb69d1fc3094c95bd74e18f784e388d (127.12.158.67:44861)
I20260502 14:06:53.282683 17449 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "c9dd32be290b436fb7f776b6f481451b" candidate_term: 9 candidate_status { last_received { term: 7 index: 14060 } } ignore_live_leader: false dest_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" is_pre_election: true
I20260502 14:06:53.282691 17705 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "c9dd32be290b436fb7f776b6f481451b" candidate_term: 9 candidate_status { last_received { term: 7 index: 14060 } } ignore_live_leader: false dest_uuid: "fcb69d1fc3094c95bd74e18f784e388d" is_pre_election: true
I20260502 14:06:53.282812 17449 raft_consensus.cc:2410] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 8 FOLLOWER]: Leader pre-election vote request: Denying vote to candidate c9dd32be290b436fb7f776b6f481451b for term 9 because replica has last-logged OpId of term: 7 index: 14061, which is greater than that of the candidate, which has last-logged OpId of term: 7 index: 14060.
I20260502 14:06:53.282835 17705 raft_consensus.cc:2468] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 8 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate c9dd32be290b436fb7f776b6f481451b in term 8.
I20260502 14:06:53.283053 17516 leader_election.cc:304] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [CANDIDATE]: Term 9 pre-election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: c9dd32be290b436fb7f776b6f481451b, fcb69d1fc3094c95bd74e18f784e388d; no voters: 8b5f3bbadc484c728f336c95a8d8fd78
I20260502 14:06:53.283159 17807 raft_consensus.cc:2804] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 8 FOLLOWER]: Leader pre-election won for term 9
I20260502 14:06:53.283206 17807 raft_consensus.cc:493] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 8 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260502 14:06:53.283222 17807 raft_consensus.cc:3060] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 8 FOLLOWER]: Advancing to term 9
I20260502 14:06:53.284164 17807 raft_consensus.cc:515] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 9 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:53.284260 17807 leader_election.cc:290] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [CANDIDATE]: Term 9 election: Requested vote from peers 8b5f3bbadc484c728f336c95a8d8fd78 (127.12.158.66:42349), fcb69d1fc3094c95bd74e18f784e388d (127.12.158.67:44861)
I20260502 14:06:53.284439 17449 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "c9dd32be290b436fb7f776b6f481451b" candidate_term: 9 candidate_status { last_received { term: 7 index: 14060 } } ignore_live_leader: false dest_uuid: "8b5f3bbadc484c728f336c95a8d8fd78"
I20260502 14:06:53.284518 17449 raft_consensus.cc:3060] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 8 FOLLOWER]: Advancing to term 9
I20260502 14:06:53.284489 17705 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "c9dd32be290b436fb7f776b6f481451b" candidate_term: 9 candidate_status { last_received { term: 7 index: 14060 } } ignore_live_leader: false dest_uuid: "fcb69d1fc3094c95bd74e18f784e388d"
I20260502 14:06:53.284580 17705 raft_consensus.cc:3060] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 8 FOLLOWER]: Advancing to term 9
I20260502 14:06:53.285411 17449 raft_consensus.cc:2410] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 9 FOLLOWER]: Leader election vote request: Denying vote to candidate c9dd32be290b436fb7f776b6f481451b for term 9 because replica has last-logged OpId of term: 7 index: 14061, which is greater than that of the candidate, which has last-logged OpId of term: 7 index: 14060.
I20260502 14:06:53.285498 17705 raft_consensus.cc:2468] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 9 FOLLOWER]: Leader election vote request: Granting yes vote for candidate c9dd32be290b436fb7f776b6f481451b in term 9.
I20260502 14:06:53.285627 17516 leader_election.cc:304] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [CANDIDATE]: Term 9 election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: c9dd32be290b436fb7f776b6f481451b, fcb69d1fc3094c95bd74e18f784e388d; no voters: 8b5f3bbadc484c728f336c95a8d8fd78
I20260502 14:06:53.285720 17807 raft_consensus.cc:2804] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 9 FOLLOWER]: Leader election won for term 9
I20260502 14:06:53.285861 17807 raft_consensus.cc:697] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 9 LEADER]: Becoming Leader. State: Replica: c9dd32be290b436fb7f776b6f481451b, State: Running, Role: LEADER
I20260502 14:06:53.285959 17807 consensus_queue.cc:237] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 14060, Committed index: 14060, Last appended: 7.14060, Last appended by leader: 14060, Current term: 9, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:53.286585 14069 catalog_manager.cc:5671] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b reported cstate change: term changed from 7 to 9, leader changed from 8b5f3bbadc484c728f336c95a8d8fd78 (127.12.158.66) to c9dd32be290b436fb7f776b6f481451b (127.12.158.65). New cstate: current_term: 9 leader_uuid: "c9dd32be290b436fb7f776b6f481451b" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } health_report { overall_health: UNKNOWN } } }
W20260502 14:06:53.288332 17409 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
I20260502 14:06:53.297665 17449 raft_consensus.cc:1275] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 9 FOLLOWER]: Refusing update from remote peer c9dd32be290b436fb7f776b6f481451b: Log matching property violated. Preceding OpId in replica: term: 7 index: 14061. Preceding OpId from leader: term: 9 index: 14062. (index mismatch)
I20260502 14:06:53.297708 17705 raft_consensus.cc:1275] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 9 FOLLOWER]: Refusing update from remote peer c9dd32be290b436fb7f776b6f481451b: Log matching property violated. Preceding OpId in replica: term: 7 index: 14059. Preceding OpId from leader: term: 9 index: 14062. (index mismatch)
I20260502 14:06:53.297916 17807 consensus_queue.cc:1048] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [LEADER]: Connected to new peer: Peer: permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 14061, Last known committed idx: 14060, Time since last communication: 0.000s
I20260502 14:06:53.298002 17807 consensus_queue.cc:1243] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [LEADER]: Peer 8b5f3bbadc484c728f336c95a8d8fd78 log is divergent from this leader: its last log entry 7.14061 is not in this leader's log and it has not received anything from this leader yet. Falling back to committed index 14060
I20260502 14:06:53.298028 17818 consensus_queue.cc:1048] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [LEADER]: Connected to new peer: Peer: permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 14061, Last known committed idx: 14056, Time since last communication: 0.000s
I20260502 14:06:53.298411 17449 pending_rounds.cc:85] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Aborting all ops after (but not including) 14060
I20260502 14:06:53.298501 17449 pending_rounds.cc:107] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Aborting uncommitted WRITE_OP operation due to leader change: 7.14061
I20260502 14:06:53.300101 17824 mvcc.cc:204] Tried to move back new op lower bound from 7281585411264950272 to 7281585411219648512. Current Snapshot: MvccSnapshot[applied={T|T < 7281585411264950272}]
I20260502 14:06:53.300911 17813 mvcc.cc:204] Tried to move back new op lower bound from 7281585411264950272 to 7281585411219648512. Current Snapshot: MvccSnapshot[applied={T|T < 7281585411264950272}]
W20260502 14:06:53.332206 17675 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:53.332574 17675 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:53.334013 17409 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:54.114331 14695 scanner-internal.cc:458] Time spent opening tablet: real 5.708s user 0.002s sys 0.000s
W20260502 14:06:54.348132 14696 scanner-internal.cc:458] Time spent opening tablet: real 6.010s user 0.001s sys 0.001s
W20260502 14:06:54.434665 14697 scanner-internal.cc:458] Time spent opening tablet: real 6.011s user 0.001s sys 0.001s
I20260502 14:06:55.535154 17429 tablet_service.cc:1467] Tablet server has 0 leaders and 3 scanners
I20260502 14:06:55.535884 17553 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
I20260502 14:06:55.540804 17295 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260502 14:06:55.543581 17697 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260502 14:06:55.870201 14069 ts_manager.cc:284] Unset tserver state for c9dd32be290b436fb7f776b6f481451b from MAINTENANCE_MODE
I20260502 14:06:55.872541 14068 ts_manager.cc:284] Unset tserver state for 8b5f3bbadc484c728f336c95a8d8fd78 from MAINTENANCE_MODE
I20260502 14:06:55.873107 14064 ts_manager.cc:284] Unset tserver state for bd0a653794c34d9591e2d5c89c802493 from MAINTENANCE_MODE
I20260502 14:06:55.877038 14065 ts_manager.cc:284] Unset tserver state for fcb69d1fc3094c95bd74e18f784e388d from MAINTENANCE_MODE
I20260502 14:06:56.263330 14064 ts_manager.cc:295] Set tserver state for c9dd32be290b436fb7f776b6f481451b to MAINTENANCE_MODE
I20260502 14:06:56.264161 14065 ts_manager.cc:295] Set tserver state for 8b5f3bbadc484c728f336c95a8d8fd78 to MAINTENANCE_MODE
I20260502 14:06:56.274299 14065 ts_manager.cc:295] Set tserver state for fcb69d1fc3094c95bd74e18f784e388d to MAINTENANCE_MODE
I20260502 14:06:56.281569 14065 ts_manager.cc:295] Set tserver state for bd0a653794c34d9591e2d5c89c802493 to MAINTENANCE_MODE
I20260502 14:06:56.302487 17763 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
I20260502 14:06:56.303581 17629 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
I20260502 14:06:56.305009 17495 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
I20260502 14:06:56.545244 17361 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
I20260502 14:06:56.583133 17295 tablet_service.cc:1460] Tablet server bd0a653794c34d9591e2d5c89c802493 set to quiescing
I20260502 14:06:56.583204 17295 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260502 14:06:56.631137 17697 tablet_service.cc:1460] Tablet server fcb69d1fc3094c95bd74e18f784e388d set to quiescing
I20260502 14:06:56.631198 17697 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260502 14:06:56.644280 17553 tablet_service.cc:1460] Tablet server c9dd32be290b436fb7f776b6f481451b set to quiescing
I20260502 14:06:56.644340 17553 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
I20260502 14:06:56.655664 17807 raft_consensus.cc:993] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Instructing follower 8b5f3bbadc484c728f336c95a8d8fd78 to start an election
I20260502 14:06:56.655726 17807 raft_consensus.cc:1081] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 9 LEADER]: Signalling peer 8b5f3bbadc484c728f336c95a8d8fd78 to start an election
I20260502 14:06:56.656970 17448 tablet_service.cc:2044] Received Run Leader Election RPC: tablet_id: "4710965d239c4545af4272cddced8dfb"
dest_uuid: "8b5f3bbadc484c728f336c95a8d8fd78"
from {username='slave'} at 127.12.158.65:50109
I20260502 14:06:56.657063 17448 raft_consensus.cc:493] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 9 FOLLOWER]: Starting forced leader election (received explicit request)
I20260502 14:06:56.657111 17448 raft_consensus.cc:3060] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 9 FOLLOWER]: Advancing to term 10
I20260502 14:06:56.657948 17448 raft_consensus.cc:515] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 10 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:06:56.658182 17448 leader_election.cc:290] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [CANDIDATE]: Term 10 election: Requested vote from peers c9dd32be290b436fb7f776b6f481451b (127.12.158.65:34909), fcb69d1fc3094c95bd74e18f784e388d (127.12.158.67:44861)
I20260502 14:06:56.659479 17448 raft_consensus.cc:1240] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 10 FOLLOWER]: Rejecting Update request from peer c9dd32be290b436fb7f776b6f481451b for earlier term 9. Current term is 10. Ops: [9.17019-9.17021]
I20260502 14:06:56.659837 17937 consensus_queue.cc:1059] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [LEADER]: Peer responded invalid term: Peer: permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 }, Status: INVALID_TERM, Last received: 9.17018, Next index: 17019, Last known committed idx: 17018, Time since last communication: 0.000s
I20260502 14:06:56.659968 17807 raft_consensus.cc:3055] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 9 LEADER]: Stepping down as leader of term 9
I20260502 14:06:56.660012 17807 raft_consensus.cc:740] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 9 LEADER]: Becoming Follower/Learner. State: Replica: c9dd32be290b436fb7f776b6f481451b, State: Running, Role: LEADER
I20260502 14:06:56.660058 17807 consensus_queue.cc:260] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 17021, Committed index: 17021, Last appended: 9.17021, Last appended by leader: 17021, Current term: 9, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
W20260502 14:06:56.660174 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
I20260502 14:06:56.660178 17807 raft_consensus.cc:3060] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 9 FOLLOWER]: Advancing to term 10
W20260502 14:06:56.660815 17824 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: Cannot assign timestamp to op. Tablet is not in leader mode. Last heard from a leader: 0.002s ago.
I20260502 14:06:56.661330 17429 tablet_service.cc:1460] Tablet server 8b5f3bbadc484c728f336c95a8d8fd78 set to quiescing
I20260502 14:06:56.661384 17429 tablet_service.cc:1467] Tablet server has 0 leaders and 3 scanners
W20260502 14:06:56.661948 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
I20260502 14:06:56.662869 17705 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" candidate_term: 10 candidate_status { last_received { term: 9 index: 17018 } } ignore_live_leader: true dest_uuid: "fcb69d1fc3094c95bd74e18f784e388d"
I20260502 14:06:56.662958 17705 raft_consensus.cc:3060] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 9 FOLLOWER]: Advancing to term 10
W20260502 14:06:56.663435 17409 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
I20260502 14:06:56.663789 17705 raft_consensus.cc:2410] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 10 FOLLOWER]: Leader election vote request: Denying vote to candidate 8b5f3bbadc484c728f336c95a8d8fd78 for term 10 because replica has last-logged OpId of term: 9 index: 17021, which is greater than that of the candidate, which has last-logged OpId of term: 9 index: 17018.
I20260502 14:06:56.664804 17583 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" candidate_term: 10 candidate_status { last_received { term: 9 index: 17018 } } ignore_live_leader: true dest_uuid: "c9dd32be290b436fb7f776b6f481451b"
I20260502 14:06:56.664970 17583 raft_consensus.cc:2410] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 10 FOLLOWER]: Leader election vote request: Denying vote to candidate 8b5f3bbadc484c728f336c95a8d8fd78 for term 10 because replica has last-logged OpId of term: 9 index: 17021, which is greater than that of the candidate, which has last-logged OpId of term: 9 index: 17018.
W20260502 14:06:56.665133 17408 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
I20260502 14:06:56.665313 17381 leader_election.cc:304] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [CANDIDATE]: Term 10 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 8b5f3bbadc484c728f336c95a8d8fd78; no voters: c9dd32be290b436fb7f776b6f481451b, fcb69d1fc3094c95bd74e18f784e388d
I20260502 14:06:56.666083 18022 raft_consensus.cc:2749] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 10 FOLLOWER]: Leader election lost for term 10. Reason: could not achieve majority
W20260502 14:06:56.666623 17408 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:56.667891 17675 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:56.671056 17675 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:56.671680 17675 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:56.673866 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:56.674747 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:56.675411 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:56.679870 17408 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:56.679996 17405 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:56.682080 17405 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:56.687235 17675 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:56.688472 17675 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:56.690584 17675 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:56.694156 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:56.695349 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:56.700546 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:56.702980 17408 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:56.704087 17409 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:56.711139 17407 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:56.711527 17675 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:56.715574 17675 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:56.719729 17675 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:56.722473 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:56.725288 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:56.729652 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:56.733117 17409 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:56.739352 17405 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:56.740262 17408 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:56.747646 17675 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:56.752136 17675 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:56.753777 17675 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:56.761502 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:56.768013 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:56.769501 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:56.775019 17409 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:56.784138 17409 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:56.786185 17408 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:56.791695 17675 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:56.801684 17675 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:56.802569 17675 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:56.807551 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:56.819346 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:56.820533 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:56.825034 17405 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:56.836295 17405 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:56.838253 17405 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:56.846591 17675 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:56.856801 17675 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:56.856840 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:56.867724 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:56.879647 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:56.879662 17543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:56.890276 17409 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:56.899049 17807 raft_consensus.cc:670] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: failed to trigger leader election: Illegal state: leader elections are disabled
W20260502 14:06:56.899293 17409 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:56.901470 17409 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:56.914988 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:56.921788 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:56.922260 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:56.940905 17543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:56.943831 17543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:56.947971 17543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:56.959652 18023 raft_consensus.cc:670] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: failed to trigger leader election: Illegal state: leader elections are disabled
W20260502 14:06:56.966593 17405 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:56.969601 17408 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:56.972819 17409 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:56.991210 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:56.996068 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:56.998476 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:57.018150 17543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:57.024082 17543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:57.026191 17543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:57.047675 17405 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:57.052824 17409 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:57.052824 17407 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:57.078316 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:57.082218 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:57.082304 17675 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:57.109295 18022 raft_consensus.cc:670] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: failed to trigger leader election: Illegal state: leader elections are disabled
W20260502 14:06:57.109434 17543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:57.111078 17543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:57.111908 17543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:57.140504 17405 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:57.142552 17405 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:57.143545 17405 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:57.171236 17675 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:57.175076 17675 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:57.175125 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:57.205304 17543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:57.208362 17543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:57.210026 17543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:57.237658 17405 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:57.243633 17407 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:57.245736 17405 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:57.271196 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:57.277158 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:57.282258 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:57.309063 17543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:57.310946 17543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:57.319084 17543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:57.347680 17405 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:57.347695 17406 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:57.354655 17407 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:57.384289 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:57.385274 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:57.391248 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:57.425339 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:57.425343 17543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:57.432178 17543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:57.464769 17409 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:57.465826 17407 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:57.473997 17407 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:57.506294 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:57.507314 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:57.512521 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:57.547286 17543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:57.551119 17543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:57.553550 17543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:57.591166 17407 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:57.594261 17406 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:57.596333 17405 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:57.635725 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:57.636809 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:57.641796 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:57.682723 17543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:57.682940 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:57.686618 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:57.731268 17406 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:57.731684 17407 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:57.732247 17406 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:57.777786 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:57.778294 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:57.779650 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:57.824361 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:57.825752 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:57.828449 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
I20260502 14:06:57.838765 17553 tablet_service.cc:1460] Tablet server c9dd32be290b436fb7f776b6f481451b set to quiescing
I20260502 14:06:57.838824 17553 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260502 14:06:57.862569 17429 tablet_service.cc:1460] Tablet server 8b5f3bbadc484c728f336c95a8d8fd78 set to quiescing
I20260502 14:06:57.862634 17429 tablet_service.cc:1467] Tablet server has 0 leaders and 3 scanners
W20260502 14:06:57.874058 17409 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:57.874058 17405 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:57.877506 17405 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:57.924608 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:57.925552 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:57.928092 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:57.973732 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:57.977497 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:57.980029 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:58.026582 17405 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:58.028760 17407 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:58.033715 17407 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:58.080075 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:58.081640 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:58.087263 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:58.134701 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:58.136534 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:58.141731 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:58.187966 17407 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:58.194133 17407 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:58.195176 17407 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:58.244877 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:58.248405 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:58.252923 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:58.300338 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:58.308310 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:58.311400 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:58.358189 17407 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:58.367322 17407 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:58.370378 17407 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:58.417147 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:58.427824 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:58.431370 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:58.479827 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:58.489172 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:58.494333 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:58.539036 17407 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:58.548070 17407 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:58.557235 17407 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:58.598829 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:58.611613 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:58.618140 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:58.660837 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:58.675645 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:58.680637 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:58.726465 17407 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:58.741567 17407 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:58.745786 17407 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:58.789608 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:58.808295 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:58.812913 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:58.856335 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:58.874862 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:58.877843 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:58.924683 17407 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:58.939800 17407 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:58.943969 17407 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:58.991823 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:59.007510 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
I20260502 14:06:59.007869 17429 tablet_service.cc:1460] Tablet server 8b5f3bbadc484c728f336c95a8d8fd78 set to quiescing
I20260502 14:06:59.007923 17429 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20260502 14:06:59.010063 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:59.060500 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
I20260502 14:06:59.064070 12921 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskW6FTeC/build/release/bin/kudu with pid 17233
I20260502 14:06:59.069612 12921 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskW6FTeC/build/release/bin/kudu
/tmp/dist-test-taskW6FTeC/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.12.158.68:37147
--local_ip_for_outbound_sockets=127.12.158.68
--tserver_master_addrs=127.12.158.126:36477
--webserver_port=44081
--webserver_interface=127.12.158.68
--builtin_ntp_servers=127.12.158.84:38351
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20260502 14:06:59.076810 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:59.080760 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:59.130681 17407 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:59.145536 18057 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260502 14:06:59.145704 18057 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260502 14:06:59.145732 18057 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260502 14:06:59.147181 18057 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260502 14:06:59.147256 18057 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.12.158.68
W20260502 14:06:59.147785 17407 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
I20260502 14:06:59.148959 18057 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.12.158.84:38351
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.12.158.68:37147
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.12.158.68
--webserver_port=44081
--tserver_master_addrs=127.12.158.126:36477
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.18057
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.12.158.68
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type RELEASE
built by None at 02 May 2026 13:43:15 UTC on e7f111948823
build id 11709
I20260502 14:06:59.149147 18057 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260502 14:06:59.149358 18057 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260502 14:06:59.149988 18057 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260502 14:06:59.151891 17407 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58936: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:06:59.152123 18062 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:59.152136 18065 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:59.152177 18063 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260502 14:06:59.152181 18057 server_base.cc:1061] running on GCE node
I20260502 14:06:59.152448 18057 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260502 14:06:59.152608 18057 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260502 14:06:59.153766 18057 hybrid_clock.cc:648] HybridClock initialized: now 1777730819153750 us; error 26 us; skew 500 ppm
I20260502 14:06:59.154841 18057 webserver.cc:492] Webserver started at http://127.12.158.68:44081/ using document root <none> and password file <none>
I20260502 14:06:59.155057 18057 fs_manager.cc:362] Metadata directory not provided
I20260502 14:06:59.155126 18057 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260502 14:06:59.156539 18057 fs_manager.cc:714] Time spent opening directory manager: real 0.001s user 0.000s sys 0.001s
I20260502 14:06:59.157138 18071 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:59.157353 18057 fs_manager.cc:730] Time spent opening block manager: real 0.001s user 0.000s sys 0.001s
I20260502 14:06:59.157436 18057 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/data,/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/wal
uuid: "bd0a653794c34d9591e2d5c89c802493"
format_stamp: "Formatted at 2026-05-02 14:06:22 on dist-test-slave-23m0"
I20260502 14:06:59.157704 18057 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260502 14:06:59.180830 18057 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260502 14:06:59.181119 18057 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260502 14:06:59.181278 18057 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260502 14:06:59.181463 18057 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260502 14:06:59.181707 18057 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260502 14:06:59.181736 18057 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:59.181773 18057 ts_tablet_manager.cc:616] Registered 0 tablets
I20260502 14:06:59.181787 18057 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:59.187357 18057 rpc_server.cc:307] RPC server started. Bound to: 127.12.158.68:37147
I20260502 14:06:59.187454 18184 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.12.158.68:37147 every 8 connection(s)
I20260502 14:06:59.187747 18057 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-3/data/info.pb
I20260502 14:06:59.191955 18185 heartbeater.cc:344] Connected to a master server at 127.12.158.126:36477
I20260502 14:06:59.192051 18185 heartbeater.cc:461] Registering TS with master...
I20260502 14:06:59.192229 18185 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
I20260502 14:06:59.192585 14065 ts_manager.cc:194] Re-registered known tserver with Master: bd0a653794c34d9591e2d5c89c802493 (127.12.158.68:37147)
I20260502 14:06:59.192932 14065 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.12.158.68:57627
I20260502 14:06:59.193989 12921 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskW6FTeC/build/release/bin/kudu as pid 18057
I20260502 14:06:59.194077 12921 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskW6FTeC/build/release/bin/kudu with pid 17365
W20260502 14:06:59.202656 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
I20260502 14:06:59.206717 12921 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskW6FTeC/build/release/bin/kudu
/tmp/dist-test-taskW6FTeC/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.12.158.66:42349
--local_ip_for_outbound_sockets=127.12.158.66
--tserver_master_addrs=127.12.158.126:36477
--webserver_port=33233
--webserver_interface=127.12.158.66
--builtin_ntp_servers=127.12.158.84:38351
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20260502 14:06:59.219928 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:59.221484 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:59.276940 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:59.281599 18189 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260502 14:06:59.281780 18189 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260502 14:06:59.281840 18189 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260502 14:06:59.283269 18189 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260502 14:06:59.283344 18189 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.12.158.66
I20260502 14:06:59.284862 18189 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.12.158.84:38351
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.12.158.66:42349
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.12.158.66
--webserver_port=33233
--tserver_master_addrs=127.12.158.126:36477
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.18189
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.12.158.66
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type RELEASE
built by None at 02 May 2026 13:43:15 UTC on e7f111948823
build id 11709
I20260502 14:06:59.285087 18189 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260502 14:06:59.285295 18189 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260502 14:06:59.285950 18189 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
I20260502 14:06:59.287552 14695 meta_cache.cc:1510] marking tablet server 8b5f3bbadc484c728f336c95a8d8fd78 (127.12.158.66:42349) as failed
W20260502 14:06:59.287644 14695 meta_cache.cc:302] tablet 4710965d239c4545af4272cddced8dfb: replica 8b5f3bbadc484c728f336c95a8d8fd78 (127.12.158.66:42349) has failed: Network error: TS failed: Client connection negotiation failed: client connection to 127.12.158.66:42349: connect: Connection refused (error 111)
W20260502 14:06:59.287708 18195 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:59.287824 18197 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:59.287912 18194 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260502 14:06:59.287993 18189 server_base.cc:1061] running on GCE node
I20260502 14:06:59.288257 18189 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260502 14:06:59.288489 18189 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260502 14:06:59.289635 18189 hybrid_clock.cc:648] HybridClock initialized: now 1777730819289609 us; error 39 us; skew 500 ppm
I20260502 14:06:59.290637 18189 webserver.cc:492] Webserver started at http://127.12.158.66:33233/ using document root <none> and password file <none>
I20260502 14:06:59.290825 18189 fs_manager.cc:362] Metadata directory not provided
I20260502 14:06:59.290874 18189 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260502 14:06:59.292121 18189 fs_manager.cc:714] Time spent opening directory manager: real 0.001s user 0.000s sys 0.001s
I20260502 14:06:59.292748 18204 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:59.292943 18189 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.000s sys 0.001s
I20260502 14:06:59.293025 18189 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/data,/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/wal
uuid: "8b5f3bbadc484c728f336c95a8d8fd78"
format_stamp: "Formatted at 2026-05-02 14:06:22 on dist-test-slave-23m0"
I20260502 14:06:59.293284 18189 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
W20260502 14:06:59.293958 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
W20260502 14:06:59.295998 17542 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37264: Illegal state: replica c9dd32be290b436fb7f776b6f481451b is not leader of this config: current role FOLLOWER
I20260502 14:06:59.304126 18189 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260502 14:06:59.304399 18189 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260502 14:06:59.304550 18189 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260502 14:06:59.304739 18189 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260502 14:06:59.305111 18211 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20260502 14:06:59.305895 18189 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20260502 14:06:59.305938 18189 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s user 0.000s sys 0.000s
I20260502 14:06:59.305960 18189 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20260502 14:06:59.306428 18189 ts_tablet_manager.cc:616] Registered 1 tablets
I20260502 14:06:59.306457 18189 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:59.306529 18211 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Bootstrap starting.
I20260502 14:06:59.312500 18189 rpc_server.cc:307] RPC server started. Bound to: 127.12.158.66:42349
I20260502 14:06:59.312587 18318 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.12.158.66:42349 every 8 connection(s)
I20260502 14:06:59.312856 18189 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-1/data/info.pb
I20260502 14:06:59.316944 18319 heartbeater.cc:344] Connected to a master server at 127.12.158.126:36477
I20260502 14:06:59.317034 18319 heartbeater.cc:461] Registering TS with master...
I20260502 14:06:59.317222 18319 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
I20260502 14:06:59.317693 14065 ts_manager.cc:194] Re-registered known tserver with Master: 8b5f3bbadc484c728f336c95a8d8fd78 (127.12.158.66:42349)
I20260502 14:06:59.318085 14065 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.12.158.66:56581
I20260502 14:06:59.321005 12921 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskW6FTeC/build/release/bin/kudu as pid 18189
I20260502 14:06:59.321128 12921 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskW6FTeC/build/release/bin/kudu with pid 17500
I20260502 14:06:59.333016 12921 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskW6FTeC/build/release/bin/kudu
/tmp/dist-test-taskW6FTeC/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.12.158.65:34909
--local_ip_for_outbound_sockets=127.12.158.65
--tserver_master_addrs=127.12.158.126:36477
--webserver_port=41631
--webserver_interface=127.12.158.65
--builtin_ntp_servers=127.12.158.84:38351
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20260502 14:06:59.385259 14696 meta_cache.cc:1510] marking tablet server c9dd32be290b436fb7f776b6f481451b (127.12.158.65:34909) as failed
I20260502 14:06:59.386227 18211 log.cc:826] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Log is configured to *not* fsync() on all Append() calls
W20260502 14:06:59.429862 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:06:59.430141 18322 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260502 14:06:59.430301 18322 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260502 14:06:59.430337 18322 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260502 14:06:59.431867 18322 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260502 14:06:59.431941 18322 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.12.158.65
I20260502 14:06:59.433461 18322 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.12.158.84:38351
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.12.158.65:34909
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.12.158.65
--webserver_port=41631
--tserver_master_addrs=127.12.158.126:36477
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.18322
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.12.158.65
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type RELEASE
built by None at 02 May 2026 13:43:15 UTC on e7f111948823
build id 11709
I20260502 14:06:59.433696 18322 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260502 14:06:59.433916 18322 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260502 14:06:59.434544 18322 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260502 14:06:59.436506 18329 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:59.436528 18330 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:59.436636 18332 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260502 14:06:59.436705 18322 server_base.cc:1061] running on GCE node
I20260502 14:06:59.436918 18322 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260502 14:06:59.437111 18322 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260502 14:06:59.438270 18322 hybrid_clock.cc:648] HybridClock initialized: now 1777730819438249 us; error 35 us; skew 500 ppm
I20260502 14:06:59.439345 18322 webserver.cc:492] Webserver started at http://127.12.158.65:41631/ using document root <none> and password file <none>
I20260502 14:06:59.439545 18322 fs_manager.cc:362] Metadata directory not provided
I20260502 14:06:59.439623 18322 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260502 14:06:59.440831 18322 fs_manager.cc:714] Time spent opening directory manager: real 0.001s user 0.001s sys 0.000s
W20260502 14:06:59.441532 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
I20260502 14:06:59.441534 18338 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:59.441685 18322 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.001s sys 0.000s
I20260502 14:06:59.441753 18322 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/data,/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/wal
uuid: "c9dd32be290b436fb7f776b6f481451b"
format_stamp: "Formatted at 2026-05-02 14:06:21 on dist-test-slave-23m0"
I20260502 14:06:59.442005 18322 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
W20260502 14:06:59.445130 17676 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39268: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
I20260502 14:06:59.456640 18322 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260502 14:06:59.456874 18322 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260502 14:06:59.457000 18322 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260502 14:06:59.457182 18322 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260502 14:06:59.457573 18345 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20260502 14:06:59.458518 18322 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20260502 14:06:59.458580 18322 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s user 0.000s sys 0.000s
I20260502 14:06:59.458642 18322 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20260502 14:06:59.459117 18322 ts_tablet_manager.cc:616] Registered 1 tablets
I20260502 14:06:59.459170 18322 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s user 0.000s sys 0.000s
I20260502 14:06:59.459216 18345 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Bootstrap starting.
I20260502 14:06:59.466502 18322 rpc_server.cc:307] RPC server started. Bound to: 127.12.158.65:34909
I20260502 14:06:59.466826 18322 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-0/data/info.pb
I20260502 14:06:59.467969 12921 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskW6FTeC/build/release/bin/kudu as pid 18322
I20260502 14:06:59.468065 12921 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskW6FTeC/build/release/bin/kudu with pid 17634
I20260502 14:06:59.472357 18452 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.12.158.65:34909 every 8 connection(s)
I20260502 14:06:59.479835 18453 heartbeater.cc:344] Connected to a master server at 127.12.158.126:36477
I20260502 14:06:59.479974 18453 heartbeater.cc:461] Registering TS with master...
I20260502 14:06:59.480227 18453 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
I20260502 14:06:59.480791 14065 ts_manager.cc:194] Re-registered known tserver with Master: c9dd32be290b436fb7f776b6f481451b (127.12.158.65:34909)
I20260502 14:06:59.481345 14065 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.12.158.65:60653
I20260502 14:06:59.482618 12921 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskW6FTeC/build/release/bin/kudu
/tmp/dist-test-taskW6FTeC/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.12.158.67:44861
--local_ip_for_outbound_sockets=127.12.158.67
--tserver_master_addrs=127.12.158.126:36477
--webserver_port=37231
--webserver_interface=127.12.158.67
--builtin_ntp_servers=127.12.158.84:38351
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20260502 14:06:59.555372 18345 log.cc:826] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Log is configured to *not* fsync() on all Append() calls
W20260502 14:06:59.583930 18456 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260502 14:06:59.584128 18456 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260502 14:06:59.584161 18456 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260502 14:06:59.586422 18456 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260502 14:06:59.586494 18456 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.12.158.67
I20260502 14:06:59.588829 18456 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.12.158.84:38351
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.12.158.67:44861
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.12.158.67
--webserver_port=37231
--tserver_master_addrs=127.12.158.126:36477
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.18456
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.12.158.67
--log_dir=/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 0ba7f54e15c9f673bf3e350f30b8d5f1a4037b79
build type RELEASE
built by None at 02 May 2026 13:43:15 UTC on e7f111948823
build id 11709
I20260502 14:06:59.589035 18456 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260502 14:06:59.589305 18456 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260502 14:06:59.590134 18456 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260502 14:06:59.591956 18466 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260502 14:06:59.591969 18463 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260502 14:06:59.592336 18456 server_base.cc:1061] running on GCE node
W20260502 14:06:59.592458 18464 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260502 14:06:59.592651 18456 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260502 14:06:59.592885 18456 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260502 14:06:59.594101 18456 hybrid_clock.cc:648] HybridClock initialized: now 1777730819594075 us; error 37 us; skew 500 ppm
I20260502 14:06:59.595479 18456 webserver.cc:492] Webserver started at http://127.12.158.67:37231/ using document root <none> and password file <none>
I20260502 14:06:59.595692 18456 fs_manager.cc:362] Metadata directory not provided
I20260502 14:06:59.595795 18456 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260502 14:06:59.597333 18456 fs_manager.cc:714] Time spent opening directory manager: real 0.001s user 0.000s sys 0.001s
I20260502 14:06:59.598358 18472 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260502 14:06:59.598547 18456 fs_manager.cc:730] Time spent opening block manager: real 0.001s user 0.000s sys 0.001s
I20260502 14:06:59.598683 18456 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/data,/tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/wal
uuid: "fcb69d1fc3094c95bd74e18f784e388d"
format_stamp: "Formatted at 2026-05-02 14:06:22 on dist-test-slave-23m0"
I20260502 14:06:59.599045 18456 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260502 14:06:59.609148 18456 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260502 14:06:59.609444 18456 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260502 14:06:59.609612 18456 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260502 14:06:59.609851 18456 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260502 14:06:59.610373 18479 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20260502 14:06:59.611239 18456 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20260502 14:06:59.611320 18456 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s user 0.000s sys 0.000s
I20260502 14:06:59.611351 18456 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20260502 14:06:59.612076 18456 ts_tablet_manager.cc:616] Registered 1 tablets
I20260502 14:06:59.612150 18456 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s user 0.001s sys 0.000s
I20260502 14:06:59.612213 18479 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Bootstrap starting.
I20260502 14:06:59.618688 18456 rpc_server.cc:307] RPC server started. Bound to: 127.12.158.67:44861
I20260502 14:06:59.619072 18456 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0/minicluster-data/ts-2/data/info.pb
I20260502 14:06:59.625674 18586 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.12.158.67:44861 every 8 connection(s)
I20260502 14:06:59.628206 12921 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskW6FTeC/build/release/bin/kudu as pid 18456
I20260502 14:06:59.632670 18587 heartbeater.cc:344] Connected to a master server at 127.12.158.126:36477
I20260502 14:06:59.632750 18587 heartbeater.cc:461] Registering TS with master...
I20260502 14:06:59.632921 18587 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
I20260502 14:06:59.633446 14065 ts_manager.cc:194] Re-registered known tserver with Master: fcb69d1fc3094c95bd74e18f784e388d (127.12.158.67:44861)
I20260502 14:06:59.633953 14065 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.12.158.67:54275
I20260502 14:06:59.720839 18479 log.cc:826] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Log is configured to *not* fsync() on all Append() calls
I20260502 14:06:59.796509 18253 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260502 14:06:59.810587 18368 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260502 14:06:59.812588 18521 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260502 14:06:59.825884 18119 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260502 14:07:00.193619 18185 heartbeater.cc:499] Master 127.12.158.126:36477 was elected leader, sending a full tablet report...
I20260502 14:07:00.255915 18211 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Bootstrap replayed 1/4 log segments. Stats: ops{read=4622 overwritten=0 applied=4621 ignored=0} inserts{seen=230950 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20260502 14:07:00.318873 18319 heartbeater.cc:499] Master 127.12.158.126:36477 was elected leader, sending a full tablet report...
I20260502 14:07:00.482203 18453 heartbeater.cc:499] Master 127.12.158.126:36477 was elected leader, sending a full tablet report...
I20260502 14:07:00.634788 18587 heartbeater.cc:499] Master 127.12.158.126:36477 was elected leader, sending a full tablet report...
I20260502 14:07:00.761375 18345 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Bootstrap replayed 1/4 log segments. Stats: ops{read=4623 overwritten=0 applied=4621 ignored=0} inserts{seen=230950 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20260502 14:07:00.915907 18479 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Bootstrap replayed 1/4 log segments. Stats: ops{read=4622 overwritten=0 applied=4621 ignored=0} inserts{seen=230950 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20260502 14:07:01.043517 18211 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Bootstrap replayed 2/4 log segments. Stats: ops{read=9244 overwritten=0 applied=9241 ignored=0} inserts{seen=461900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20260502 14:07:01.853000 18211 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Bootstrap replayed 3/4 log segments. Stats: ops{read=13866 overwritten=0 applied=13863 ignored=0} inserts{seen=692950 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20260502 14:07:02.012354 18345 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Bootstrap replayed 2/4 log segments. Stats: ops{read=9244 overwritten=0 applied=9241 ignored=0} inserts{seen=461900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20260502 14:07:02.175397 18479 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Bootstrap replayed 2/4 log segments. Stats: ops{read=9244 overwritten=0 applied=9242 ignored=0} inserts{seen=461950 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20260502 14:07:02.412184 18211 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Bootstrap replayed 4/4 log segments. Stats: ops{read=17019 overwritten=1 applied=17018 ignored=0} inserts{seen=850650 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20260502 14:07:02.412782 18211 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Bootstrap complete.
I20260502 14:07:02.419088 18211 ts_tablet_manager.cc:1403] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Time spent bootstrapping tablet: real 3.113s user 2.671s sys 0.431s
I20260502 14:07:02.420257 18211 raft_consensus.cc:359] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 10 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:07:02.420476 18211 raft_consensus.cc:740] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 10 FOLLOWER]: Becoming Follower/Learner. State: Replica: 8b5f3bbadc484c728f336c95a8d8fd78, State: Initialized, Role: FOLLOWER
I20260502 14:07:02.420608 18211 consensus_queue.cc:260] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 17018, Last appended: 9.17018, Last appended by leader: 17018, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:07:02.420881 18211 ts_tablet_manager.cc:1434] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78: Time spent starting tablet: real 0.002s user 0.005s sys 0.000s
W20260502 14:07:02.672569 18232 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36460: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:07:02.674010 18232 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36460: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:07:02.684453 18232 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36460: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
I20260502 14:07:02.791671 18630 raft_consensus.cc:493] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 10 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260502 14:07:02.791836 18630 raft_consensus.cc:515] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 10 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:07:02.792217 18630 leader_election.cc:290] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [CANDIDATE]: Term 11 pre-election: Requested pre-vote from peers c9dd32be290b436fb7f776b6f481451b (127.12.158.65:34909), fcb69d1fc3094c95bd74e18f784e388d (127.12.158.67:44861)
I20260502 14:07:02.796260 18407 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" candidate_term: 11 candidate_status { last_received { term: 9 index: 17018 } } ignore_live_leader: false dest_uuid: "c9dd32be290b436fb7f776b6f481451b" is_pre_election: true
W20260502 14:07:02.797499 18205 leader_election.cc:343] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [CANDIDATE]: Term 11 pre-election: Tablet error from VoteRequest() call to peer c9dd32be290b436fb7f776b6f481451b (127.12.158.65:34909): Illegal state: must be running to vote when last-logged opid is not known
I20260502 14:07:02.796722 18541 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" candidate_term: 11 candidate_status { last_received { term: 9 index: 17018 } } ignore_live_leader: false dest_uuid: "fcb69d1fc3094c95bd74e18f784e388d" is_pre_election: true
W20260502 14:07:02.797780 18206 leader_election.cc:343] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [CANDIDATE]: Term 11 pre-election: Tablet error from VoteRequest() call to peer fcb69d1fc3094c95bd74e18f784e388d (127.12.158.67:44861): Illegal state: must be running to vote when last-logged opid is not known
I20260502 14:07:02.797847 18206 leader_election.cc:304] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [CANDIDATE]: Term 11 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 8b5f3bbadc484c728f336c95a8d8fd78; no voters: c9dd32be290b436fb7f776b6f481451b, fcb69d1fc3094c95bd74e18f784e388d
I20260502 14:07:02.797969 18630 raft_consensus.cc:2749] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 10 FOLLOWER]: Leader pre-election lost for term 11. Reason: could not achieve majority
I20260502 14:07:02.995386 18345 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Bootstrap replayed 3/4 log segments. Stats: ops{read=13868 overwritten=0 applied=13864 ignored=0} inserts{seen=693000 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 4 replicates
W20260502 14:07:03.001271 18232 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36460: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:07:03.008306 18232 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36460: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:07:03.017468 18232 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36460: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
I20260502 14:07:03.097784 18479 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Bootstrap replayed 3/4 log segments. Stats: ops{read=13865 overwritten=0 applied=13865 ignored=0} inserts{seen=693050 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20260502 14:07:03.232002 18630 raft_consensus.cc:493] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 10 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260502 14:07:03.232121 18630 raft_consensus.cc:515] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 10 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:07:03.232277 18630 leader_election.cc:290] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [CANDIDATE]: Term 11 pre-election: Requested pre-vote from peers c9dd32be290b436fb7f776b6f481451b (127.12.158.65:34909), fcb69d1fc3094c95bd74e18f784e388d (127.12.158.67:44861)
I20260502 14:07:03.232473 18407 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" candidate_term: 11 candidate_status { last_received { term: 9 index: 17018 } } ignore_live_leader: false dest_uuid: "c9dd32be290b436fb7f776b6f481451b" is_pre_election: true
I20260502 14:07:03.232496 18541 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" candidate_term: 11 candidate_status { last_received { term: 9 index: 17018 } } ignore_live_leader: false dest_uuid: "fcb69d1fc3094c95bd74e18f784e388d" is_pre_election: true
W20260502 14:07:03.232679 18205 leader_election.cc:343] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [CANDIDATE]: Term 11 pre-election: Tablet error from VoteRequest() call to peer c9dd32be290b436fb7f776b6f481451b (127.12.158.65:34909): Illegal state: must be running to vote when last-logged opid is not known
W20260502 14:07:03.232743 18206 leader_election.cc:343] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [CANDIDATE]: Term 11 pre-election: Tablet error from VoteRequest() call to peer fcb69d1fc3094c95bd74e18f784e388d (127.12.158.67:44861): Illegal state: must be running to vote when last-logged opid is not known
I20260502 14:07:03.232796 18206 leader_election.cc:304] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [CANDIDATE]: Term 11 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 8b5f3bbadc484c728f336c95a8d8fd78; no voters: c9dd32be290b436fb7f776b6f481451b, fcb69d1fc3094c95bd74e18f784e388d
I20260502 14:07:03.232863 18630 raft_consensus.cc:2749] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 10 FOLLOWER]: Leader pre-election lost for term 11. Reason: could not achieve majority
W20260502 14:07:03.344563 18232 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36460: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:07:03.348610 18232 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36460: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:07:03.354750 18232 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36460: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
I20260502 14:07:03.593334 18345 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Bootstrap replayed 4/4 log segments. Stats: ops{read=17021 overwritten=0 applied=17021 ignored=0} inserts{seen=850800 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20260502 14:07:03.593844 18345 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Bootstrap complete.
I20260502 14:07:03.600184 18345 ts_tablet_manager.cc:1403] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Time spent bootstrapping tablet: real 4.141s user 3.656s sys 0.464s
I20260502 14:07:03.600986 18345 raft_consensus.cc:359] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 10 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:07:03.601161 18345 raft_consensus.cc:740] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 10 FOLLOWER]: Becoming Follower/Learner. State: Replica: c9dd32be290b436fb7f776b6f481451b, State: Initialized, Role: FOLLOWER
I20260502 14:07:03.601271 18345 consensus_queue.cc:260] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 17021, Last appended: 9.17021, Last appended by leader: 17021, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:07:03.601524 18345 ts_tablet_manager.cc:1434] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b: Time spent starting tablet: real 0.001s user 0.001s sys 0.000s
I20260502 14:07:03.657934 18479 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Bootstrap replayed 4/4 log segments. Stats: ops{read=17021 overwritten=0 applied=17021 ignored=0} inserts{seen=850800 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20260502 14:07:03.658502 18479 tablet_bootstrap.cc:492] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Bootstrap complete.
I20260502 14:07:03.664788 18479 ts_tablet_manager.cc:1403] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Time spent bootstrapping tablet: real 4.053s user 3.541s sys 0.491s
I20260502 14:07:03.665617 18479 raft_consensus.cc:359] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 10 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:07:03.665796 18479 raft_consensus.cc:740] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 10 FOLLOWER]: Becoming Follower/Learner. State: Replica: fcb69d1fc3094c95bd74e18f784e388d, State: Initialized, Role: FOLLOWER
I20260502 14:07:03.665913 18479 consensus_queue.cc:260] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 17021, Last appended: 9.17021, Last appended by leader: 17021, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:07:03.666194 18479 ts_tablet_manager.cc:1434] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d: Time spent starting tablet: real 0.001s user 0.002s sys 0.000s
W20260502 14:07:03.693709 18232 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36460: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:07:03.705672 18232 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36460: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
W20260502 14:07:03.705680 18233 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36460: Illegal state: replica 8b5f3bbadc484c728f336c95a8d8fd78 is not leader of this config: current role FOLLOWER
I20260502 14:07:03.809393 18639 raft_consensus.cc:493] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 10 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260502 14:07:03.809484 18639 raft_consensus.cc:515] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 10 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:07:03.809656 18639 leader_election.cc:290] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [CANDIDATE]: Term 11 pre-election: Requested pre-vote from peers c9dd32be290b436fb7f776b6f481451b (127.12.158.65:34909), fcb69d1fc3094c95bd74e18f784e388d (127.12.158.67:44861)
I20260502 14:07:03.809931 18541 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" candidate_term: 11 candidate_status { last_received { term: 9 index: 17018 } } ignore_live_leader: false dest_uuid: "fcb69d1fc3094c95bd74e18f784e388d" is_pre_election: true
I20260502 14:07:03.809957 18407 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" candidate_term: 11 candidate_status { last_received { term: 9 index: 17018 } } ignore_live_leader: false dest_uuid: "c9dd32be290b436fb7f776b6f481451b" is_pre_election: true
I20260502 14:07:03.810072 18541 raft_consensus.cc:2410] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 10 FOLLOWER]: Leader pre-election vote request: Denying vote to candidate 8b5f3bbadc484c728f336c95a8d8fd78 for term 11 because replica has last-logged OpId of term: 9 index: 17021, which is greater than that of the candidate, which has last-logged OpId of term: 9 index: 17018.
I20260502 14:07:03.810089 18407 raft_consensus.cc:2410] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 10 FOLLOWER]: Leader pre-election vote request: Denying vote to candidate 8b5f3bbadc484c728f336c95a8d8fd78 for term 11 because replica has last-logged OpId of term: 9 index: 17021, which is greater than that of the candidate, which has last-logged OpId of term: 9 index: 17018.
I20260502 14:07:03.810384 18205 leader_election.cc:304] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [CANDIDATE]: Term 11 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 8b5f3bbadc484c728f336c95a8d8fd78; no voters: c9dd32be290b436fb7f776b6f481451b, fcb69d1fc3094c95bd74e18f784e388d
I20260502 14:07:03.810509 18639 raft_consensus.cc:2749] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 10 FOLLOWER]: Leader pre-election lost for term 11. Reason: could not achieve majority
W20260502 14:07:03.810946 18499 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58976: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:07:03.825572 18499 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58976: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
W20260502 14:07:03.827229 18499 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58976: Illegal state: replica fcb69d1fc3094c95bd74e18f784e388d is not leader of this config: current role FOLLOWER
I20260502 14:07:03.897704 18637 raft_consensus.cc:493] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 10 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260502 14:07:03.897855 18637 raft_consensus.cc:515] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 10 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:07:03.898161 18637 leader_election.cc:290] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [CANDIDATE]: Term 11 pre-election: Requested pre-vote from peers 8b5f3bbadc484c728f336c95a8d8fd78 (127.12.158.66:42349), fcb69d1fc3094c95bd74e18f784e388d (127.12.158.67:44861)
I20260502 14:07:03.902042 18541 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "c9dd32be290b436fb7f776b6f481451b" candidate_term: 11 candidate_status { last_received { term: 9 index: 17021 } } ignore_live_leader: false dest_uuid: "fcb69d1fc3094c95bd74e18f784e388d" is_pre_election: true
I20260502 14:07:03.902228 18541 raft_consensus.cc:2468] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 10 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate c9dd32be290b436fb7f776b6f481451b in term 10.
I20260502 14:07:03.902166 18273 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "c9dd32be290b436fb7f776b6f481451b" candidate_term: 11 candidate_status { last_received { term: 9 index: 17021 } } ignore_live_leader: false dest_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" is_pre_election: true
I20260502 14:07:03.902347 18273 raft_consensus.cc:2468] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 10 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate c9dd32be290b436fb7f776b6f481451b in term 10.
I20260502 14:07:03.902449 18340 leader_election.cc:304] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [CANDIDATE]: Term 11 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: c9dd32be290b436fb7f776b6f481451b, fcb69d1fc3094c95bd74e18f784e388d; no voters:
I20260502 14:07:03.902587 18637 raft_consensus.cc:2804] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 10 FOLLOWER]: Leader pre-election won for term 11
I20260502 14:07:03.902649 18637 raft_consensus.cc:493] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 10 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260502 14:07:03.902696 18637 raft_consensus.cc:3060] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 10 FOLLOWER]: Advancing to term 11
I20260502 14:07:03.903717 18637 raft_consensus.cc:515] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 11 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:07:03.903889 18637 leader_election.cc:290] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [CANDIDATE]: Term 11 election: Requested vote from peers 8b5f3bbadc484c728f336c95a8d8fd78 (127.12.158.66:42349), fcb69d1fc3094c95bd74e18f784e388d (127.12.158.67:44861)
I20260502 14:07:03.904069 18541 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "c9dd32be290b436fb7f776b6f481451b" candidate_term: 11 candidate_status { last_received { term: 9 index: 17021 } } ignore_live_leader: false dest_uuid: "fcb69d1fc3094c95bd74e18f784e388d"
I20260502 14:07:03.904055 18273 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "4710965d239c4545af4272cddced8dfb" candidate_uuid: "c9dd32be290b436fb7f776b6f481451b" candidate_term: 11 candidate_status { last_received { term: 9 index: 17021 } } ignore_live_leader: false dest_uuid: "8b5f3bbadc484c728f336c95a8d8fd78"
I20260502 14:07:03.904129 18541 raft_consensus.cc:3060] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 10 FOLLOWER]: Advancing to term 11
I20260502 14:07:03.904151 18273 raft_consensus.cc:3060] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 10 FOLLOWER]: Advancing to term 11
I20260502 14:07:03.905133 18273 raft_consensus.cc:2468] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 11 FOLLOWER]: Leader election vote request: Granting yes vote for candidate c9dd32be290b436fb7f776b6f481451b in term 11.
I20260502 14:07:03.905246 18541 raft_consensus.cc:2468] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 11 FOLLOWER]: Leader election vote request: Granting yes vote for candidate c9dd32be290b436fb7f776b6f481451b in term 11.
I20260502 14:07:03.905359 18340 leader_election.cc:304] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [CANDIDATE]: Term 11 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 8b5f3bbadc484c728f336c95a8d8fd78, c9dd32be290b436fb7f776b6f481451b; no voters:
I20260502 14:07:03.905454 18637 raft_consensus.cc:2804] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 11 FOLLOWER]: Leader election won for term 11
I20260502 14:07:03.905580 18637 raft_consensus.cc:697] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [term 11 LEADER]: Becoming Leader. State: Replica: c9dd32be290b436fb7f776b6f481451b, State: Running, Role: LEADER
I20260502 14:07:03.905689 18637 consensus_queue.cc:237] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 17021, Committed index: 17021, Last appended: 9.17021, Last appended by leader: 17021, Current term: 11, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } }
I20260502 14:07:03.906394 14064 catalog_manager.cc:5671] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b reported cstate change: term changed from 9 to 11. New cstate: current_term: 11 leader_uuid: "c9dd32be290b436fb7f776b6f481451b" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "c9dd32be290b436fb7f776b6f481451b" member_type: VOTER last_known_addr { host: "127.12.158.65" port: 34909 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 } health_report { overall_health: UNKNOWN } } }
W20260502 14:07:03.921698 14697 scanner-internal.cc:458] Time spent opening tablet: real 5.707s user 0.001s sys 0.001s
I20260502 14:07:03.933207 18273 raft_consensus.cc:1275] T 4710965d239c4545af4272cddced8dfb P 8b5f3bbadc484c728f336c95a8d8fd78 [term 11 FOLLOWER]: Refusing update from remote peer c9dd32be290b436fb7f776b6f481451b: Log matching property violated. Preceding OpId in replica: term: 9 index: 17018. Preceding OpId from leader: term: 11 index: 17023. (index mismatch)
I20260502 14:07:03.933238 18541 raft_consensus.cc:1275] T 4710965d239c4545af4272cddced8dfb P fcb69d1fc3094c95bd74e18f784e388d [term 11 FOLLOWER]: Refusing update from remote peer c9dd32be290b436fb7f776b6f481451b: Log matching property violated. Preceding OpId in replica: term: 9 index: 17021. Preceding OpId from leader: term: 11 index: 17023. (index mismatch)
I20260502 14:07:03.933488 18644 consensus_queue.cc:1048] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [LEADER]: Connected to new peer: Peer: permanent_uuid: "fcb69d1fc3094c95bd74e18f784e388d" member_type: VOTER last_known_addr { host: "127.12.158.67" port: 44861 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 17022, Last known committed idx: 17021, Time since last communication: 0.000s
I20260502 14:07:03.933701 18644 consensus_queue.cc:1048] T 4710965d239c4545af4272cddced8dfb P c9dd32be290b436fb7f776b6f481451b [LEADER]: Connected to new peer: Peer: permanent_uuid: "8b5f3bbadc484c728f336c95a8d8fd78" member_type: VOTER last_known_addr { host: "127.12.158.66" port: 42349 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 17022, Last known committed idx: 17018, Time since last communication: 0.000s
I20260502 14:07:03.935463 18651 mvcc.cc:204] Tried to move back new op lower bound from 7281585454827880448 to 7281585454718087168. Current Snapshot: MvccSnapshot[applied={T|T < 7281585454827880448}]
I20260502 14:07:03.935561 18652 mvcc.cc:204] Tried to move back new op lower bound from 7281585454827880448 to 7281585454718087168. Current Snapshot: MvccSnapshot[applied={T|T < 7281585454827880448}]
I20260502 14:07:03.938401 18654 mvcc.cc:204] Tried to move back new op lower bound from 7281585454827880448 to 7281585454718087168. Current Snapshot: MvccSnapshot[applied={T|T < 7281585454827880448}]
W20260502 14:07:04.092200 14695 scanner-internal.cc:458] Time spent opening tablet: real 6.008s user 0.001s sys 0.001s
W20260502 14:07:04.190935 14696 scanner-internal.cc:458] Time spent opening tablet: real 6.010s user 0.001s sys 0.000s
I20260502 14:07:05.105213 18119 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260502 14:07:05.106091 18368 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
I20260502 14:07:05.112119 18253 tablet_service.cc:1467] Tablet server has 0 leaders and 3 scanners
I20260502 14:07:05.120559 18521 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260502 14:07:05.541770 14072 ts_manager.cc:284] Unset tserver state for bd0a653794c34d9591e2d5c89c802493 from MAINTENANCE_MODE
I20260502 14:07:05.542543 14064 ts_manager.cc:284] Unset tserver state for c9dd32be290b436fb7f776b6f481451b from MAINTENANCE_MODE
I20260502 14:07:05.607301 14064 ts_manager.cc:284] Unset tserver state for fcb69d1fc3094c95bd74e18f784e388d from MAINTENANCE_MODE
I20260502 14:07:05.620980 14064 ts_manager.cc:284] Unset tserver state for 8b5f3bbadc484c728f336c95a8d8fd78 from MAINTENANCE_MODE
/home/jenkins-slave/workspace/build_and_test_flaky/src/kudu/integration-tests/maintenance_mode-itest.cc:751: Failure
Value of: s.ok()
Actual: true
Expected: false
/home/jenkins-slave/workspace/build_and_test_flaky/src/kudu/util/test_util.cc:402: Failure
Failed
Timed out waiting for assertion to pass.
I20260502 14:07:05.937155 18587 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
I20260502 14:07:05.938975 18319 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
I20260502 14:07:05.939337 18453 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
I20260502 14:07:06.197773 18185 heartbeater.cc:507] Master 127.12.158.126:36477 requested a full tablet report, sending...
I20260502 14:07:07.358353 12921 external_mini_cluster-itest-base.cc:80] Found fatal failure
I20260502 14:07:07.358469 12921 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 0 with UUID c9dd32be290b436fb7f776b6f481451b and pid 18322
************************ BEGIN STACKS **************************
[New LWP 18325]
[New LWP 18326]
[New LWP 18327]
[New LWP 18328]
[New LWP 18334]
[New LWP 18335]
[New LWP 18336]
[New LWP 18339]
[New LWP 18340]
[New LWP 18341]
[New LWP 18342]
[New LWP 18343]
[New LWP 18344]
[New LWP 18346]
[New LWP 18347]
[New LWP 18348]
[New LWP 18349]
[New LWP 18350]
[New LWP 18351]
[New LWP 18352]
[New LWP 18353]
[New LWP 18354]
[New LWP 18355]
[New LWP 18356]
[New LWP 18357]
[New LWP 18358]
[New LWP 18359]
[New LWP 18360]
[New LWP 18361]
[New LWP 18362]
[New LWP 18363]
[New LWP 18364]
[New LWP 18365]
[New LWP 18366]
[New LWP 18367]
[New LWP 18368]
[New LWP 18369]
[New LWP 18370]
[New LWP 18371]
[New LWP 18372]
[New LWP 18373]
[New LWP 18374]
[New LWP 18375]
[New LWP 18376]
[New LWP 18377]
[New LWP 18378]
[New LWP 18379]
[New LWP 18380]
[New LWP 18381]
[New LWP 18382]
[New LWP 18383]
[New LWP 18384]
[New LWP 18385]
[New LWP 18386]
[New LWP 18387]
[New LWP 18388]
[New LWP 18389]
[New LWP 18390]
[New LWP 18391]
[New LWP 18392]
[New LWP 18393]
[New LWP 18394]
[New LWP 18395]
[New LWP 18396]
[New LWP 18397]
[New LWP 18398]
[New LWP 18399]
[New LWP 18400]
[New LWP 18401]
[New LWP 18402]
[New LWP 18403]
[New LWP 18404]
[New LWP 18405]
[New LWP 18406]
[New LWP 18407]
[New LWP 18408]
[New LWP 18409]
[New LWP 18410]
[New LWP 18411]
[New LWP 18412]
[New LWP 18413]
[New LWP 18414]
[New LWP 18415]
[New LWP 18416]
[New LWP 18417]
[New LWP 18418]
[New LWP 18419]
[New LWP 18420]
[New LWP 18421]
[New LWP 18422]
[New LWP 18423]
[New LWP 18424]
[New LWP 18425]
[New LWP 18426]
[New LWP 18427]
[New LWP 18428]
[New LWP 18429]
[New LWP 18430]
[New LWP 18431]
[New LWP 18432]
[New LWP 18433]
[New LWP 18434]
[New LWP 18435]
[New LWP 18436]
[New LWP 18437]
[New LWP 18438]
[New LWP 18439]
[New LWP 18440]
[New LWP 18441]
[New LWP 18442]
[New LWP 18443]
[New LWP 18444]
[New LWP 18445]
[New LWP 18446]
[New LWP 18447]
[New LWP 18448]
[New LWP 18449]
[New LWP 18450]
[New LWP 18451]
[New LWP 18452]
[New LWP 18453]
[New LWP 18454]
[New LWP 18667]
[New LWP 18756]
0x00007fd1e5a7ad50 in ?? ()
Id Target Id Frame
* 1 LWP 18322 "kudu" 0x00007fd1e5a7ad50 in ?? ()
2 LWP 18325 "kudu" 0x00007fd1e5a76fb9 in ?? ()
3 LWP 18326 "kudu" 0x00007fd1e5a76fb9 in ?? ()
4 LWP 18327 "kudu" 0x00007fd1e5a76fb9 in ?? ()
5 LWP 18328 "kernel-watcher-" 0x00007fd1e5a76fb9 in ?? ()
6 LWP 18334 "ntp client-1833" 0x00007fd1e5a7a9e2 in ?? ()
7 LWP 18335 "file cache-evic" 0x00007fd1e5a76fb9 in ?? ()
8 LWP 18336 "sq_acceptor" 0x00007fd1e3b1cbb9 in ?? ()
9 LWP 18339 "rpc reactor-183" 0x00007fd1e3b29947 in ?? ()
10 LWP 18340 "rpc reactor-183" 0x00007fd1e3b29947 in ?? ()
11 LWP 18341 "rpc reactor-183" 0x00007fd1e3b29947 in ?? ()
12 LWP 18342 "rpc reactor-183" 0x00007fd1e3b29947 in ?? ()
13 LWP 18343 "MaintenanceMgr " 0x00007fd1e5a76ad3 in ?? ()
14 LWP 18344 "txn-status-mana" 0x00007fd1e5a76fb9 in ?? ()
15 LWP 18346 "collect_and_rem" 0x00007fd1e5a76fb9 in ?? ()
16 LWP 18347 "tc-session-exp-" 0x00007fd1e5a76fb9 in ?? ()
17 LWP 18348 "rpc worker-1834" 0x00007fd1e5a76ad3 in ?? ()
18 LWP 18349 "rpc worker-1834" 0x00007fd1e5a76ad3 in ?? ()
19 LWP 18350 "rpc worker-1835" 0x00007fd1e5a76ad3 in ?? ()
20 LWP 18351 "rpc worker-1835" 0x00007fd1e5a76ad3 in ?? ()
21 LWP 18352 "rpc worker-1835" 0x00007fd1e5a76ad3 in ?? ()
22 LWP 18353 "rpc worker-1835" 0x00007fd1e5a76ad3 in ?? ()
23 LWP 18354 "rpc worker-1835" 0x00007fd1e5a76ad3 in ?? ()
24 LWP 18355 "rpc worker-1835" 0x00007fd1e5a76ad3 in ?? ()
25 LWP 18356 "rpc worker-1835" 0x00007fd1e5a76ad3 in ?? ()
26 LWP 18357 "rpc worker-1835" 0x00007fd1e5a76ad3 in ?? ()
27 LWP 18358 "rpc worker-1835" 0x00007fd1e5a76ad3 in ?? ()
28 LWP 18359 "rpc worker-1835" 0x00007fd1e5a76ad3 in ?? ()
29 LWP 18360 "rpc worker-1836" 0x00007fd1e5a76ad3 in ?? ()
30 LWP 18361 "rpc worker-1836" 0x00007fd1e5a76ad3 in ?? ()
31 LWP 18362 "rpc worker-1836" 0x00007fd1e5a76ad3 in ?? ()
32 LWP 18363 "rpc worker-1836" 0x00007fd1e5a76ad3 in ?? ()
33 LWP 18364 "rpc worker-1836" 0x00007fd1e5a76ad3 in ?? ()
34 LWP 18365 "rpc worker-1836" 0x00007fd1e5a76ad3 in ?? ()
35 LWP 18366 "rpc worker-1836" 0x00007fd1e5a76ad3 in ?? ()
36 LWP 18367 "rpc worker-1836" 0x00007fd1e5a76ad3 in ?? ()
37 LWP 18368 "rpc worker-1836" 0x00007fd1e5a76ad3 in ?? ()
38 LWP 18369 "rpc worker-1836" 0x00007fd1e5a76ad3 in ?? ()
39 LWP 18370 "rpc worker-1837" 0x00007fd1e5a76ad3 in ?? ()
40 LWP 18371 "rpc worker-1837" 0x00007fd1e5a76ad3 in ?? ()
41 LWP 18372 "rpc worker-1837" 0x00007fd1e5a76ad3 in ?? ()
42 LWP 18373 "rpc worker-1837" 0x00007fd1e5a76ad3 in ?? ()
43 LWP 18374 "rpc worker-1837" 0x00007fd1e5a76ad3 in ?? ()
44 LWP 18375 "rpc worker-1837" 0x00007fd1e5a76ad3 in ?? ()
45 LWP 18376 "rpc worker-1837" 0x00007fd1e5a76ad3 in ?? ()
46 LWP 18377 "rpc worker-1837" 0x00007fd1e5a76ad3 in ?? ()
47 LWP 18378 "rpc worker-1837" 0x00007fd1e5a76ad3 in ?? ()
48 LWP 18379 "rpc worker-1837" 0x00007fd1e5a76ad3 in ?? ()
49 LWP 18380 "rpc worker-1838" 0x00007fd1e5a76ad3 in ?? ()
50 LWP 18381 "rpc worker-1838" 0x00007fd1e5a76ad3 in ?? ()
51 LWP 18382 "rpc worker-1838" 0x00007fd1e5a76ad3 in ?? ()
52 LWP 18383 "rpc worker-1838" 0x00007fd1e5a76ad3 in ?? ()
53 LWP 18384 "rpc worker-1838" 0x00007fd1e5a76ad3 in ?? ()
54 LWP 18385 "rpc worker-1838" 0x00007fd1e5a76ad3 in ?? ()
55 LWP 18386 "rpc worker-1838" 0x00007fd1e5a76ad3 in ?? ()
56 LWP 18387 "rpc worker-1838" 0x00007fd1e5a76ad3 in ?? ()
57 LWP 18388 "rpc worker-1838" 0x00007fd1e5a76ad3 in ?? ()
58 LWP 18389 "rpc worker-1838" 0x00007fd1e5a76ad3 in ?? ()
59 LWP 18390 "rpc worker-1839" 0x00007fd1e5a76ad3 in ?? ()
60 LWP 18391 "rpc worker-1839" 0x00007fd1e5a76ad3 in ?? ()
61 LWP 18392 "rpc worker-1839" 0x00007fd1e5a76ad3 in ?? ()
62 LWP 18393 "rpc worker-1839" 0x00007fd1e5a76ad3 in ?? ()
63 LWP 18394 "rpc worker-1839" 0x00007fd1e5a76ad3 in ?? ()
64 LWP 18395 "rpc worker-1839" 0x00007fd1e5a76ad3 in ?? ()
65 LWP 18396 "rpc worker-1839" 0x00007fd1e5a76ad3 in ?? ()
66 LWP 18397 "rpc worker-1839" 0x00007fd1e5a76ad3 in ?? ()
67 LWP 18398 "rpc worker-1839" 0x00007fd1e5a76ad3 in ?? ()
68 LWP 18399 "rpc worker-1839" 0x00007fd1e5a76ad3 in ?? ()
69 LWP 18400 "rpc worker-1840" 0x00007fd1e5a76ad3 in ?? ()
70 LWP 18401 "rpc worker-1840" 0x00007fd1e5a76ad3 in ?? ()
71 LWP 18402 "rpc worker-1840" 0x00007fd1e5a76ad3 in ?? ()
72 LWP 18403 "rpc worker-1840" 0x00007fd1e5a76ad3 in ?? ()
73 LWP 18404 "rpc worker-1840" 0x00007fd1e5a76ad3 in ?? ()
74 LWP 18405 "rpc worker-1840" 0x00007fd1e5a76ad3 in ?? ()
75 LWP 18406 "rpc worker-1840" 0x00007fd1e5a76ad3 in ?? ()
76 LWP 18407 "rpc worker-1840" 0x00007fd1e5a76ad3 in ?? ()
77 LWP 18408 "rpc worker-1840" 0x00007fd1e5a76ad3 in ?? ()
78 LWP 18409 "rpc worker-1840" 0x00007fd1e5a76ad3 in ?? ()
79 LWP 18410 "rpc worker-1841" 0x00007fd1e5a76ad3 in ?? ()
80 LWP 18411 "rpc worker-1841" 0x00007fd1e5a76ad3 in ?? ()
81 LWP 18412 "rpc worker-1841" 0x00007fd1e5a76ad3 in ?? ()
82 LWP 18413 "rpc worker-1841" 0x00007fd1e5a76ad3 in ?? ()
83 LWP 18414 "rpc worker-1841" 0x00007fd1e5a76ad3 in ?? ()
84 LWP 18415 "rpc worker-1841" 0x00007fd1e5a76ad3 in ?? ()
85 LWP 18416 "rpc worker-1841" 0x00007fd1e5a76ad3 in ?? ()
86 LWP 18417 "rpc worker-1841" 0x00007fd1e5a76ad3 in ?? ()
87 LWP 18418 "rpc worker-1841" 0x00007fd1e5a76ad3 in ?? ()
88 LWP 18419 "rpc worker-1841" 0x00007fd1e5a76ad3 in ?? ()
89 LWP 18420 "rpc worker-1842" 0x00007fd1e5a76ad3 in ?? ()
90 LWP 18421 "rpc worker-1842" 0x00007fd1e5a76ad3 in ?? ()
91 LWP 18422 "rpc worker-1842" 0x00007fd1e5a76ad3 in ?? ()
92 LWP 18423 "rpc worker-1842" 0x00007fd1e5a76ad3 in ?? ()
93 LWP 18424 "rpc worker-1842" 0x00007fd1e5a76ad3 in ?? ()
94 LWP 18425 "rpc worker-1842" 0x00007fd1e5a76ad3 in ?? ()
95 LWP 18426 "rpc worker-1842" 0x00007fd1e5a76ad3 in ?? ()
96 LWP 18427 "rpc worker-1842" 0x00007fd1e5a76ad3 in ?? ()
97 LWP 18428 "rpc worker-1842" 0x00007fd1e5a76ad3 in ?? ()
98 LWP 18429 "rpc worker-1842" 0x00007fd1e5a76ad3 in ?? ()
99 LWP 18430 "rpc worker-1843" 0x00007fd1e5a76ad3 in ?? ()
100 LWP 18431 "rpc worker-1843" 0x00007fd1e5a76ad3 in ?? ()
101 LWP 18432 "rpc worker-1843" 0x00007fd1e5a76ad3 in ?? ()
102 LWP 18433 "rpc worker-1843" 0x00007fd1e5a76ad3 in ?? ()
103 LWP 18434 "rpc worker-1843" 0x00007fd1e5a76ad3 in ?? ()
104 LWP 18435 "rpc worker-1843" 0x00007fd1e5a76ad3 in ?? ()
105 LWP 18436 "rpc worker-1843" 0x00007fd1e5a76ad3 in ?? ()
106 LWP 18437 "rpc worker-1843" 0x00007fd1e5a76ad3 in ?? ()
107 LWP 18438 "rpc worker-1843" 0x00007fd1e5a76ad3 in ?? ()
108 LWP 18439 "rpc worker-1843" 0x00007fd1e5a76ad3 in ?? ()
109 LWP 18440 "rpc worker-1844" 0x00007fd1e5a76ad3 in ?? ()
110 LWP 18441 "rpc worker-1844" 0x00007fd1e5a76ad3 in ?? ()
111 LWP 18442 "rpc worker-1844" 0x00007fd1e5a76ad3 in ?? ()
112 LWP 18443 "rpc worker-1844" 0x00007fd1e5a76ad3 in ?? ()
113 LWP 18444 "rpc worker-1844" 0x00007fd1e5a76ad3 in ?? ()
114 LWP 18445 "rpc worker-1844" 0x00007fd1e5a76ad3 in ?? ()
115 LWP 18446 "rpc worker-1844" 0x00007fd1e5a76ad3 in ?? ()
116 LWP 18447 "rpc worker-1844" 0x00007fd1e5a76ad3 in ?? ()
117 LWP 18448 "diag-logger-184" 0x00007fd1e5a76fb9 in ?? ()
118 LWP 18449 "result-tracker-" 0x00007fd1e5a76fb9 in ?? ()
119 LWP 18450 "excess-log-dele" 0x00007fd1e5a76fb9 in ?? ()
120 LWP 18451 "tcmalloc-memory" 0x00007fd1e5a76fb9 in ?? ()
121 LWP 18452 "acceptor-18452" 0x00007fd1e3b2afc7 in ?? ()
122 LWP 18453 "heartbeat-18453" 0x00007fd1e5a76fb9 in ?? ()
123 LWP 18454 "maintenance_sch" 0x00007fd1e5a76fb9 in ?? ()
124 LWP 18667 "raft [worker]-1" 0x00007fd1e5a76fb9 in ?? ()
125 LWP 18756 "raft [worker]-1" 0x00007fd1e5a76fb9 in ?? ()
Thread 125 (LWP 18756):
#0 0x00007fd1e5a76fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000001 in ?? ()
#3 0x0000000100000081 in ?? ()
#4 0x00007fd19a730764 in ?? ()
#5 0x00007fd19a730510 in ?? ()
#6 0x0000000000000003 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x00007fd19a730530 in ?? ()
#9 0x0000000200000000 in ?? ()
#10 0x0000008000000189 in ?? ()
#11 0x00007fd19a730590 in ?? ()
#12 0x00007fd1e56b68d1 in ?? ()
#13 0x0000000000000000 in ?? ()
Thread 124 (LWP 18667):
#0 0x00007fd1e5a76fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000423 in ?? ()
#3 0x0000000100000081 in ?? ()
#4 0x00007fd19772a764 in ?? ()
#5 0x00007fd19772a510 in ?? ()
#6 0x0000000000000847 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x00007fd19772a530 in ?? ()
#9 0x0000000200000000 in ?? ()
#10 0x0000008000000189 in ?? ()
#11 0x00007fd19772a590 in ?? ()
#12 0x00007fd1e56b68d1 in ?? ()
#13 0x0000000000000000 in ?? ()
Thread 123 (LWP 18454):
#0 0x00007fd1e5a76fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000021 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000055660c279e50 in ?? ()
#5 0x00007fd19c734470 in ?? ()
#6 0x0000000000000042 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 122 (LWP 18453):
#0 0x00007fd1e5a76fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x000000000000000b in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000055660c1cb630 in ?? ()
#5 0x00007fd19cf353f0 in ?? ()
#6 0x0000000000000016 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 121 (LWP 18452):
#0 0x00007fd1e3b2afc7 in ?? ()
#1 0x00007fd19d7360d8 in ?? ()
#2 0x00000001e56c7672 in ?? ()
#3 0x00007fd1e54e6060 in ?? ()
#4 0x0000000000080800 in ?? ()
#5 0x00007fd19d7363e0 in ?? ()
#6 0x00007fd19d736090 in ?? ()
#7 0x000055660c184978 in ?? ()
#8 0x00007fd1e56cd1c9 in ?? ()
#9 0x00007fd19d736510 in ?? ()
#10 0x00007fd19d736700 in ?? ()
#11 0x0000008000000004 in ?? ()
#12 0x00007fd1e2af95f9 in ?? ()
#13 0x0000000000000000 in ?? ()
Thread 120 (LWP 18451):
#0 0x00007fd1e5a76fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000001 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x00007ffee0011a90 in ?? ()
#5 0x00007fd19df37670 in ?? ()
#6 0x0000000000000002 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 119 (LWP 18450):
#0 0x00007fd1e5a76fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 118 (LWP 18449):
#0 0x00007fd1e5a76fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000008 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000055660c0fc3e0 in ?? ()
#5 0x00007fd19ef39680 in ?? ()
#6 0x0000000000000010 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 117 (LWP 18448):
#0 0x00007fd1e5a76fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000008 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000055660c44fc90 in ?? ()
#5 0x00007fd19f73a550 in ?? ()
#6 0x0000000000000010 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 116 (LWP 18447):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 115 (LWP 18446):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 114 (LWP 18445):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 113 (LWP 18444):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 112 (LWP 18443):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 111 (LWP 18442):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 110 (LWP 18441):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 109 (LWP 18440):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 108 (LWP 18439):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 107 (LWP 18438):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 106 (LWP 18437):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 105 (LWP 18436):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 104 (LWP 18435):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 103 (LWP 18434):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 102 (LWP 18433):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 101 (LWP 18432):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000008 in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x000055660c453ab8 in ?? ()
#4 0x00007fd1a774a5c0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fd1a774a5e0 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 100 (LWP 18431):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 99 (LWP 18430):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 98 (LWP 18429):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 97 (LWP 18428):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 96 (LWP 18427):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 95 (LWP 18426):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 94 (LWP 18425):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 93 (LWP 18424):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 92 (LWP 18423):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 91 (LWP 18422):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 90 (LWP 18421):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 89 (LWP 18420):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 88 (LWP 18419):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 87 (LWP 18418):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 86 (LWP 18417):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 85 (LWP 18416):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 84 (LWP 18415):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 83 (LWP 18414):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 82 (LWP 18413):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 81 (LWP 18412):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 80 (LWP 18411):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 79 (LWP 18410):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 78 (LWP 18409):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 77 (LWP 18408):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 76 (LWP 18407):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000005 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x000055660c443c3c in ?? ()
#4 0x00007fd1b3f635c0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fd1b3f635e0 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x000055660c443c28 in ?? ()
#9 0x00007fd1e5a76770 in ?? ()
#10 0x00007fd1b3f635e0 in ?? ()
#11 0x00007fd1b3f63640 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 75 (LWP 18406):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 74 (LWP 18405):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 73 (LWP 18404):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 72 (LWP 18403):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 71 (LWP 18402):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 70 (LWP 18401):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 69 (LWP 18400):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 68 (LWP 18399):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 67 (LWP 18398):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 66 (LWP 18397):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 65 (LWP 18396):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 64 (LWP 18395):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 63 (LWP 18394):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 62 (LWP 18393):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 61 (LWP 18392):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 60 (LWP 18391):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 59 (LWP 18390):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 58 (LWP 18389):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 57 (LWP 18388):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 56 (LWP 18387):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 55 (LWP 18386):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 54 (LWP 18385):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 53 (LWP 18384):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 52 (LWP 18383):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 51 (LWP 18382):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 50 (LWP 18381):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 49 (LWP 18380):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 48 (LWP 18379):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 47 (LWP 18378):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 46 (LWP 18377):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 45 (LWP 18376):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 44 (LWP 18375):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 43 (LWP 18374):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 42 (LWP 18373):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 41 (LWP 18372):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 40 (LWP 18371):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 39 (LWP 18370):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 38 (LWP 18369):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 37 (LWP 18368):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000002 in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x000055660c453a38 in ?? ()
#4 0x00007fd1c778a5c0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fd1c778a5e0 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 36 (LWP 18367):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000191 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x000055660c4426bc in ?? ()
#4 0x00007fd1c7f8b5c0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fd1c7f8b5e0 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x000055660c4426a8 in ?? ()
#9 0x00007fd1e5a76770 in ?? ()
#10 0x00007fd1c7f8b5e0 in ?? ()
#11 0x00007fd1c7f8b640 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 35 (LWP 18366):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000255 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x000055660c44263c in ?? ()
#4 0x00007fd1c878c5c0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fd1c878c5e0 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x000055660c442628 in ?? ()
#9 0x00007fd1e5a76770 in ?? ()
#10 0x00007fd1c878c5e0 in ?? ()
#11 0x00007fd1c878c640 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 34 (LWP 18365):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 33 (LWP 18364):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 32 (LWP 18363):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 31 (LWP 18362):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 30 (LWP 18361):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 29 (LWP 18360):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 28 (LWP 18359):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 27 (LWP 18358):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 26 (LWP 18357):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 25 (LWP 18356):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x000000000000022e in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x000055660c4539b8 in ?? ()
#4 0x00007fd1cd7965c0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fd1cd7965e0 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 24 (LWP 18355):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 23 (LWP 18354):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 22 (LWP 18353):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 21 (LWP 18352):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 20 (LWP 18351):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 19 (LWP 18350):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 18 (LWP 18349):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 17 (LWP 18348):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 16 (LWP 18347):
#0 0x00007fd1e5a76fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 15 (LWP 18346):
#0 0x00007fd1e5a76fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000001 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000055660c0e2b88 in ?? ()
#5 0x00007fd1d27a06a0 in ?? ()
#6 0x0000000000000002 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 14 (LWP 18344):
#0 0x00007fd1e5a76fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 13 (LWP 18343):
#0 0x00007fd1e5a76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 12 (LWP 18342):
#0 0x00007fd1e3b29947 in ?? ()
#1 0x00007fd1d47a4680 in ?? ()
#2 0x00007fd1df0af571 in ?? ()
#3 0x00007fd1d47a4680 in ?? ()
#4 0x000055660c1dd398 in ?? ()
#5 0x00007fd1d47a46c0 in ?? ()
#6 0x00007fd1d47a4840 in ?? ()
#7 0x000055660c2893f0 in ?? ()
#8 0x00007fd1df0b125d in ?? ()
#9 0x3fb965e5da53c000 in ?? ()
#10 0x000055660c1cec00 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055660c1cec00 in ?? ()
#13 0x000000000c1dd398 in ?? ()
#14 0x0000556600000000 in ?? ()
#15 0x41da7d802edba1ca in ?? ()
#16 0x000055660c2893f0 in ?? ()
#17 0x00007fd1d47a4720 in ?? ()
#18 0x00007fd1df0b5ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb965e5da53c000 in ?? ()
#21 0x0000000000000000 in ?? ()
Thread 11 (LWP 18341):
#0 0x00007fd1e3b29947 in ?? ()
#1 0x00007fd1d4fa5680 in ?? ()
#2 0x00007fd1df0af571 in ?? ()
#3 0x00007fd1d4fa5680 in ?? ()
#4 0x000055660c1dd018 in ?? ()
#5 0x00007fd1d4fa56c0 in ?? ()
#6 0x00007fd1d4fa5840 in ?? ()
#7 0x000055660c2893f0 in ?? ()
#8 0x00007fd1df0b125d in ?? ()
#9 0x3fb3330ffc108000 in ?? ()
#10 0x000055660c1ce100 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055660c1ce100 in ?? ()
#13 0x000000000c1dd018 in ?? ()
#14 0x0000556600000000 in ?? ()
#15 0x41da7d802edba1ca in ?? ()
#16 0x000055660c2893f0 in ?? ()
#17 0x00007fd1d4fa5720 in ?? ()
#18 0x00007fd1df0b5ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb3330ffc108000 in ?? ()
#21 0x0000000000000000 in ?? ()
Thread 10 (LWP 18340):
#0 0x00007fd1e3b29947 in ?? ()
#1 0x00007fd1d57a6680 in ?? ()
#2 0x00007fd1df0af571 in ?? ()
#3 0x00007fd1d57a6680 in ?? ()
#4 0x000055660c1dd558 in ?? ()
#5 0x00007fd1d57a66c0 in ?? ()
#6 0x00007fd1d57a6840 in ?? ()
#7 0x000055660c2893f0 in ?? ()
#8 0x00007fd1df0b125d in ?? ()
#9 0x3f9725edf2ae0000 in ?? ()
#10 0x000055660c1cdb80 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055660c1cdb80 in ?? ()
#13 0x000000000c1dd558 in ?? ()
#14 0x0000556600000000 in ?? ()
#15 0x41da7d802edba1ca in ?? ()
#16 0x000055660c2893f0 in ?? ()
#17 0x00007fd1d57a6720 in ?? ()
#18 0x00007fd1df0b5ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3f9725edf2ae0000 in ?? ()
#21 0x0000000000000000 in ?? ()
Thread 9 (LWP 18339):
#0 0x00007fd1e3b29947 in ?? ()
#1 0x00007fd1d738a680 in ?? ()
#2 0x00007fd1df0af571 in ?? ()
#3 0x00007fd1d738a680 in ?? ()
#4 0x000055660c1dce58 in ?? ()
#5 0x00007fd1d738a6c0 in ?? ()
#6 0x00007fd1d738a840 in ?? ()
#7 0x000055660c2893f0 in ?? ()
#8 0x00007fd1df0b125d in ?? ()
#9 0x3fb9651ad9394000 in ?? ()
#10 0x000055660c1ce680 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055660c1ce680 in ?? ()
#13 0x000000000c1dce58 in ?? ()
#14 0x0000556600000000 in ?? ()
#15 0x41da7d802edba1ca in ?? ()
#16 0x000055660c2893f0 in ?? ()
#17 0x00007fd1d738a720 in ?? ()
#18 0x00007fd1df0b5ba3 in ?? ()
#19 0x0000000000000000 in ?? ()
Thread 8 (LWP 18336):
#0 0x00007fd1e3b1cbb9 in ?? ()
#1 0x00007fd1d8b8d840 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 7 (LWP 18335):
#0 0x00007fd1e5a76fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 6 (LWP 18334):
#0 0x00007fd1e5a7a9e2 in ?? ()
#1 0x000055660c0fdee0 in ?? ()
#2 0x00007fd1d7b8b4d0 in ?? ()
#3 0x00007fd1d7b8b450 in ?? ()
#4 0x00007fd1d7b8b570 in ?? ()
#5 0x00007fd1d7b8b790 in ?? ()
#6 0x00007fd1d7b8b7a0 in ?? ()
#7 0x00007fd1d7b8b4e0 in ?? ()
#8 0x00007fd1d7b8b4d0 in ?? ()
#9 0x000055660c0fdc80 in ?? ()
#10 0x00007fd1e608697f in ?? ()
#11 0x3ff0000000000000 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 5 (LWP 18328):
#0 0x00007fd1e5a76fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000029 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000055660c2834c8 in ?? ()
#5 0x00007fd1d9b8f430 in ?? ()
#6 0x0000000000000052 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 4 (LWP 18327):
#0 0x00007fd1e5a76fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000002 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000055660c0e2848 in ?? ()
#5 0x00007fd1da390790 in ?? ()
#6 0x0000000000000004 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 3 (LWP 18326):
#0 0x00007fd1e5a76fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000002 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000055660c0e22a8 in ?? ()
#5 0x00007fd1dab91790 in ?? ()
#6 0x0000000000000004 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 2 (LWP 18325):
#0 0x00007fd1e5a76fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000002 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000055660c0e2188 in ?? ()
#5 0x00007fd1db392790 in ?? ()
#6 0x0000000000000004 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 1 (LWP 18322):
#0 0x00007fd1e5a7ad50 in ?? ()
#1 0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20260502 14:07:07.863198 12921 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 1 with UUID 8b5f3bbadc484c728f336c95a8d8fd78 and pid 18189
************************ BEGIN STACKS **************************
[New LWP 18190]
[New LWP 18191]
[New LWP 18192]
[New LWP 18193]
[New LWP 18200]
[New LWP 18201]
[New LWP 18202]
[New LWP 18205]
[New LWP 18206]
[New LWP 18207]
[New LWP 18208]
[New LWP 18209]
[New LWP 18210]
[New LWP 18212]
[New LWP 18213]
[New LWP 18214]
[New LWP 18215]
[New LWP 18216]
[New LWP 18217]
[New LWP 18218]
[New LWP 18219]
[New LWP 18220]
[New LWP 18221]
[New LWP 18222]
[New LWP 18223]
[New LWP 18224]
[New LWP 18225]
[New LWP 18226]
[New LWP 18227]
[New LWP 18228]
[New LWP 18229]
[New LWP 18230]
[New LWP 18231]
[New LWP 18232]
[New LWP 18233]
[New LWP 18234]
[New LWP 18235]
[New LWP 18236]
[New LWP 18237]
[New LWP 18238]
[New LWP 18239]
[New LWP 18240]
[New LWP 18241]
[New LWP 18242]
[New LWP 18243]
[New LWP 18244]
[New LWP 18245]
[New LWP 18246]
[New LWP 18247]
[New LWP 18248]
[New LWP 18249]
[New LWP 18250]
[New LWP 18251]
[New LWP 18252]
[New LWP 18253]
[New LWP 18254]
[New LWP 18255]
[New LWP 18256]
[New LWP 18257]
[New LWP 18258]
[New LWP 18259]
[New LWP 18260]
[New LWP 18261]
[New LWP 18262]
[New LWP 18263]
[New LWP 18264]
[New LWP 18265]
[New LWP 18266]
[New LWP 18267]
[New LWP 18268]
[New LWP 18269]
[New LWP 18270]
[New LWP 18271]
[New LWP 18272]
[New LWP 18273]
[New LWP 18274]
[New LWP 18275]
[New LWP 18276]
[New LWP 18277]
[New LWP 18278]
[New LWP 18279]
[New LWP 18280]
[New LWP 18281]
[New LWP 18282]
[New LWP 18283]
[New LWP 18284]
[New LWP 18285]
[New LWP 18286]
[New LWP 18287]
[New LWP 18288]
[New LWP 18289]
[New LWP 18290]
[New LWP 18291]
[New LWP 18292]
[New LWP 18293]
[New LWP 18294]
[New LWP 18295]
[New LWP 18296]
[New LWP 18297]
[New LWP 18298]
[New LWP 18299]
[New LWP 18300]
[New LWP 18301]
[New LWP 18302]
[New LWP 18303]
[New LWP 18304]
[New LWP 18305]
[New LWP 18306]
[New LWP 18307]
[New LWP 18308]
[New LWP 18309]
[New LWP 18310]
[New LWP 18311]
[New LWP 18312]
[New LWP 18313]
[New LWP 18314]
[New LWP 18315]
[New LWP 18316]
[New LWP 18317]
[New LWP 18318]
[New LWP 18319]
[New LWP 18320]
0x00007f8c00af3d50 in ?? ()
Id Target Id Frame
* 1 LWP 18189 "kudu" 0x00007f8c00af3d50 in ?? ()
2 LWP 18190 "kudu" 0x00007f8c00aeffb9 in ?? ()
3 LWP 18191 "kudu" 0x00007f8c00aeffb9 in ?? ()
4 LWP 18192 "kudu" 0x00007f8c00aeffb9 in ?? ()
5 LWP 18193 "kernel-watcher-" 0x00007f8c00aeffb9 in ?? ()
6 LWP 18200 "ntp client-1820" 0x00007f8c00af39e2 in ?? ()
7 LWP 18201 "file cache-evic" 0x00007f8c00aeffb9 in ?? ()
8 LWP 18202 "sq_acceptor" 0x00007f8bfeb95bb9 in ?? ()
9 LWP 18205 "rpc reactor-182" 0x00007f8bfeba2947 in ?? ()
10 LWP 18206 "rpc reactor-182" 0x00007f8bfeba2947 in ?? ()
11 LWP 18207 "rpc reactor-182" 0x00007f8bfeba2947 in ?? ()
12 LWP 18208 "rpc reactor-182" 0x00007f8bfeba2947 in ?? ()
13 LWP 18209 "MaintenanceMgr " 0x00007f8c00aefad3 in ?? ()
14 LWP 18210 "txn-status-mana" 0x00007f8c00aeffb9 in ?? ()
15 LWP 18212 "collect_and_rem" 0x00007f8c00aeffb9 in ?? ()
16 LWP 18213 "tc-session-exp-" 0x00007f8c00aeffb9 in ?? ()
17 LWP 18214 "rpc worker-1821" 0x00007f8c00aefad3 in ?? ()
18 LWP 18215 "rpc worker-1821" 0x00007f8c00aefad3 in ?? ()
19 LWP 18216 "rpc worker-1821" 0x00007f8c00aefad3 in ?? ()
20 LWP 18217 "rpc worker-1821" 0x00007f8c00aefad3 in ?? ()
21 LWP 18218 "rpc worker-1821" 0x00007f8c00aefad3 in ?? ()
22 LWP 18219 "rpc worker-1821" 0x00007f8c00aefad3 in ?? ()
23 LWP 18220 "rpc worker-1822" 0x00007f8c00aefad3 in ?? ()
24 LWP 18221 "rpc worker-1822" 0x00007f8c00aefad3 in ?? ()
25 LWP 18222 "rpc worker-1822" 0x00007f8c00aefad3 in ?? ()
26 LWP 18223 "rpc worker-1822" 0x00007f8c00aefad3 in ?? ()
27 LWP 18224 "rpc worker-1822" 0x00007f8c00aefad3 in ?? ()
28 LWP 18225 "rpc worker-1822" 0x00007f8c00aefad3 in ?? ()
29 LWP 18226 "rpc worker-1822" 0x00007f8c00aefad3 in ?? ()
30 LWP 18227 "rpc worker-1822" 0x00007f8c00aefad3 in ?? ()
31 LWP 18228 "rpc worker-1822" 0x00007f8c00aefad3 in ?? ()
32 LWP 18229 "rpc worker-1822" 0x00007f8c00aefad3 in ?? ()
33 LWP 18230 "rpc worker-1823" 0x00007f8c00aefad3 in ?? ()
34 LWP 18231 "rpc worker-1823" 0x00007f8c00aefad3 in ?? ()
35 LWP 18232 "rpc worker-1823" 0x00007f8c00aefad3 in ?? ()
36 LWP 18233 "rpc worker-1823" 0x00007f8c00aefad3 in ?? ()
37 LWP 18234 "rpc worker-1823" 0x00007f8c00aefad3 in ?? ()
38 LWP 18235 "rpc worker-1823" 0x00007f8c00aefad3 in ?? ()
39 LWP 18236 "rpc worker-1823" 0x00007f8c00aefad3 in ?? ()
40 LWP 18237 "rpc worker-1823" 0x00007f8c00aefad3 in ?? ()
41 LWP 18238 "rpc worker-1823" 0x00007f8c00aefad3 in ?? ()
42 LWP 18239 "rpc worker-1823" 0x00007f8c00aefad3 in ?? ()
43 LWP 18240 "rpc worker-1824" 0x00007f8c00aefad3 in ?? ()
44 LWP 18241 "rpc worker-1824" 0x00007f8c00aefad3 in ?? ()
45 LWP 18242 "rpc worker-1824" 0x00007f8c00aefad3 in ?? ()
46 LWP 18243 "rpc worker-1824" 0x00007f8c00aefad3 in ?? ()
47 LWP 18244 "rpc worker-1824" 0x00007f8c00aefad3 in ?? ()
48 LWP 18245 "rpc worker-1824" 0x00007f8c00aefad3 in ?? ()
49 LWP 18246 "rpc worker-1824" 0x00007f8c00aefad3 in ?? ()
50 LWP 18247 "rpc worker-1824" 0x00007f8c00aefad3 in ?? ()
51 LWP 18248 "rpc worker-1824" 0x00007f8c00aefad3 in ?? ()
52 LWP 18249 "rpc worker-1824" 0x00007f8c00aefad3 in ?? ()
53 LWP 18250 "rpc worker-1825" 0x00007f8c00aefad3 in ?? ()
54 LWP 18251 "rpc worker-1825" 0x00007f8c00aefad3 in ?? ()
55 LWP 18252 "rpc worker-1825" 0x00007f8c00aefad3 in ?? ()
56 LWP 18253 "rpc worker-1825" 0x00007f8c00aefad3 in ?? ()
57 LWP 18254 "rpc worker-1825" 0x00007f8c00aefad3 in ?? ()
58 LWP 18255 "rpc worker-1825" 0x00007f8c00aefad3 in ?? ()
59 LWP 18256 "rpc worker-1825" 0x00007f8c00aefad3 in ?? ()
60 LWP 18257 "rpc worker-1825" 0x00007f8c00aefad3 in ?? ()
61 LWP 18258 "rpc worker-1825" 0x00007f8c00aefad3 in ?? ()
62 LWP 18259 "rpc worker-1825" 0x00007f8c00aefad3 in ?? ()
63 LWP 18260 "rpc worker-1826" 0x00007f8c00aefad3 in ?? ()
64 LWP 18261 "rpc worker-1826" 0x00007f8c00aefad3 in ?? ()
65 LWP 18262 "rpc worker-1826" 0x00007f8c00aefad3 in ?? ()
66 LWP 18263 "rpc worker-1826" 0x00007f8c00aefad3 in ?? ()
67 LWP 18264 "rpc worker-1826" 0x00007f8c00aefad3 in ?? ()
68 LWP 18265 "rpc worker-1826" 0x00007f8c00aefad3 in ?? ()
69 LWP 18266 "rpc worker-1826" 0x00007f8c00aefad3 in ?? ()
70 LWP 18267 "rpc worker-1826" 0x00007f8c00aefad3 in ?? ()
71 LWP 18268 "rpc worker-1826" 0x00007f8c00aefad3 in ?? ()
72 LWP 18269 "rpc worker-1826" 0x00007f8c00aefad3 in ?? ()
73 LWP 18270 "rpc worker-1827" 0x00007f8c00aefad3 in ?? ()
74 LWP 18271 "rpc worker-1827" 0x00007f8c00aefad3 in ?? ()
75 LWP 18272 "rpc worker-1827" 0x00007f8c00aefad3 in ?? ()
76 LWP 18273 "rpc worker-1827" 0x00007f8c00aefad3 in ?? ()
77 LWP 18274 "rpc worker-1827" 0x00007f8c00aefad3 in ?? ()
78 LWP 18275 "rpc worker-1827" 0x00007f8c00aefad3 in ?? ()
79 LWP 18276 "rpc worker-1827" 0x00007f8c00aefad3 in ?? ()
80 LWP 18277 "rpc worker-1827" 0x00007f8c00aefad3 in ?? ()
81 LWP 18278 "rpc worker-1827" 0x00007f8c00aefad3 in ?? ()
82 LWP 18279 "rpc worker-1827" 0x00007f8c00aefad3 in ?? ()
83 LWP 18280 "rpc worker-1828" 0x00007f8c00aefad3 in ?? ()
84 LWP 18281 "rpc worker-1828" 0x00007f8c00aefad3 in ?? ()
85 LWP 18282 "rpc worker-1828" 0x00007f8c00aefad3 in ?? ()
86 LWP 18283 "rpc worker-1828" 0x00007f8c00aefad3 in ?? ()
87 LWP 18284 "rpc worker-1828" 0x00007f8c00aefad3 in ?? ()
88 LWP 18285 "rpc worker-1828" 0x00007f8c00aefad3 in ?? ()
89 LWP 18286 "rpc worker-1828" 0x00007f8c00aefad3 in ?? ()
90 LWP 18287 "rpc worker-1828" 0x00007f8c00aefad3 in ?? ()
91 LWP 18288 "rpc worker-1828" 0x00007f8c00aefad3 in ?? ()
92 LWP 18289 "rpc worker-1828" 0x00007f8c00aefad3 in ?? ()
93 LWP 18290 "rpc worker-1829" 0x00007f8c00aefad3 in ?? ()
94 LWP 18291 "rpc worker-1829" 0x00007f8c00aefad3 in ?? ()
95 LWP 18292 "rpc worker-1829" 0x00007f8c00aefad3 in ?? ()
96 LWP 18293 "rpc worker-1829" 0x00007f8c00aefad3 in ?? ()
97 LWP 18294 "rpc worker-1829" 0x00007f8c00aefad3 in ?? ()
98 LWP 18295 "rpc worker-1829" 0x00007f8c00aefad3 in ?? ()
99 LWP 18296 "rpc worker-1829" 0x00007f8c00aefad3 in ?? ()
100 LWP 18297 "rpc worker-1829" 0x00007f8c00aefad3 in ?? ()
101 LWP 18298 "rpc worker-1829" 0x00007f8c00aefad3 in ?? ()
102 LWP 18299 "rpc worker-1829" 0x00007f8c00aefad3 in ?? ()
103 LWP 18300 "rpc worker-1830" 0x00007f8c00aefad3 in ?? ()
104 LWP 18301 "rpc worker-1830" 0x00007f8c00aefad3 in ?? ()
105 LWP 18302 "rpc worker-1830" 0x00007f8c00aefad3 in ?? ()
106 LWP 18303 "rpc worker-1830" 0x00007f8c00aefad3 in ?? ()
107 LWP 18304 "rpc worker-1830" 0x00007f8c00aefad3 in ?? ()
108 LWP 18305 "rpc worker-1830" 0x00007f8c00aefad3 in ?? ()
109 LWP 18306 "rpc worker-1830" 0x00007f8c00aefad3 in ?? ()
110 LWP 18307 "rpc worker-1830" 0x00007f8c00aefad3 in ?? ()
111 LWP 18308 "rpc worker-1830" 0x00007f8c00aefad3 in ?? ()
112 LWP 18309 "rpc worker-1830" 0x00007f8c00aefad3 in ?? ()
113 LWP 18310 "rpc worker-1831" 0x00007f8c00aefad3 in ?? ()
114 LWP 18311 "rpc worker-1831" 0x00007f8c00aefad3 in ?? ()
115 LWP 18312 "rpc worker-1831" 0x00007f8c00aefad3 in ?? ()
116 LWP 18313 "rpc worker-1831" 0x00007f8c00aefad3 in ?? ()
117 LWP 18314 "diag-logger-183" 0x00007f8c00aeffb9 in ?? ()
118 LWP 18315 "result-tracker-" 0x00007f8c00aeffb9 in ?? ()
119 LWP 18316 "excess-log-dele" 0x00007f8c00aeffb9 in ?? ()
120 LWP 18317 "tcmalloc-memory" 0x00007f8c00aeffb9 in ?? ()
121 LWP 18318 "acceptor-18318" 0x00007f8bfeba3fc7 in ?? ()
122 LWP 18319 "heartbeat-18319" 0x00007f8c00aeffb9 in ?? ()
123 LWP 18320 "maintenance_sch" 0x00007f8c00aeffb9 in ?? ()
Thread 123 (LWP 18320):
#0 0x00007f8c00aeffb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000023 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000561ede6a7e50 in ?? ()
#5 0x00007f8bb77ad470 in ?? ()
#6 0x0000000000000046 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 122 (LWP 18319):
#0 0x00007f8c00aeffb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x000000000000000b in ?? ()
#3 0x0000000100000081 in ?? ()
#4 0x0000561ede5f9634 in ?? ()
#5 0x00007f8bb7fae3f0 in ?? ()
#6 0x0000000000000017 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x00007f8bb7fae410 in ?? ()
#9 0x0000000200000000 in ?? ()
#10 0x0000008000000189 in ?? ()
#11 0x00007f8bb7fae470 in ?? ()
#12 0x00007f8c0072f8d1 in ?? ()
#13 0x0000000000000000 in ?? ()
Thread 121 (LWP 18318):
#0 0x00007f8bfeba3fc7 in ?? ()
#1 0x00007f8bb87af020 in ?? ()
#2 0x00007f8c00740672 in ?? ()
#3 0x00007f8c0055f060 in ?? ()
#4 0x0000000000080800 in ?? ()
#5 0x00007f8bb87af3e0 in ?? ()
#6 0x00007f8bb87af090 in ?? ()
#7 0x0000561ede5b2978 in ?? ()
#8 0x00007f8c007461c9 in ?? ()
#9 0x00007f8bb87af510 in ?? ()
#10 0x00007f8bb87af700 in ?? ()
#11 0x0000008000000005 in ?? ()
#12 0x00007f8bfdb725f9 in ?? ()
#13 0x0000000000000000 in ?? ()
Thread 120 (LWP 18317):
#0 0x00007f8c00aeffb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000001 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x00007ffdfa6ff080 in ?? ()
#5 0x00007f8bb8fb0670 in ?? ()
#6 0x0000000000000002 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 119 (LWP 18316):
#0 0x00007f8c00aeffb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 118 (LWP 18315):
#0 0x00007f8c00aeffb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000008 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000561ede52a3e0 in ?? ()
#5 0x00007f8bb9fb2680 in ?? ()
#6 0x0000000000000010 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 117 (LWP 18314):
#0 0x00007f8c00aeffb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000008 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000561ede8ad390 in ?? ()
#5 0x00007f8bba7b3550 in ?? ()
#6 0x0000000000000010 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 116 (LWP 18313):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000008 in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x0000561ede87f338 in ?? ()
#4 0x00007f8bbafb45c0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f8bbafb45e0 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 115 (LWP 18312):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 114 (LWP 18311):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 113 (LWP 18310):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 112 (LWP 18309):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 111 (LWP 18308):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 110 (LWP 18307):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 109 (LWP 18306):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 108 (LWP 18305):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 107 (LWP 18304):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 106 (LWP 18303):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 105 (LWP 18302):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 104 (LWP 18301):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 103 (LWP 18300):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 102 (LWP 18299):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 101 (LWP 18298):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 100 (LWP 18297):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 99 (LWP 18296):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 98 (LWP 18295):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 97 (LWP 18294):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 96 (LWP 18293):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 95 (LWP 18292):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 94 (LWP 18291):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 93 (LWP 18290):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 92 (LWP 18289):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 91 (LWP 18288):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 90 (LWP 18287):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 89 (LWP 18286):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 88 (LWP 18285):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 87 (LWP 18284):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 86 (LWP 18283):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 85 (LWP 18282):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 84 (LWP 18281):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 83 (LWP 18280):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 82 (LWP 18279):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 81 (LWP 18278):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 80 (LWP 18277):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 79 (LWP 18276):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 78 (LWP 18275):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 77 (LWP 18274):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 76 (LWP 18273):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x000000000000018a in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x0000561ede86bd38 in ?? ()
#4 0x00007f8bcefdc5c0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f8bcefdc5e0 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 75 (LWP 18272):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000378 in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x0000561ede86bcb8 in ?? ()
#4 0x00007f8bcf7dd5c0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f8bcf7dd5e0 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 74 (LWP 18271):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 73 (LWP 18270):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 72 (LWP 18269):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 71 (LWP 18268):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 70 (LWP 18267):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 69 (LWP 18266):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 68 (LWP 18265):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 67 (LWP 18264):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 66 (LWP 18263):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 65 (LWP 18262):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 64 (LWP 18261):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 63 (LWP 18260):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 62 (LWP 18259):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 61 (LWP 18258):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 60 (LWP 18257):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 59 (LWP 18256):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 58 (LWP 18255):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 57 (LWP 18254):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 56 (LWP 18253):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000002 in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x0000561ede86b238 in ?? ()
#4 0x00007f8bd8ff05c0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f8bd8ff05e0 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 55 (LWP 18252):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 54 (LWP 18251):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 53 (LWP 18250):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 52 (LWP 18249):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 51 (LWP 18248):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 50 (LWP 18247):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 49 (LWP 18246):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 48 (LWP 18245):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 47 (LWP 18244):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 46 (LWP 18243):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 45 (LWP 18242):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 44 (LWP 18241):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 43 (LWP 18240):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 42 (LWP 18239):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 41 (LWP 18238):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 40 (LWP 18237):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 39 (LWP 18236):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 38 (LWP 18235):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 37 (LWP 18234):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 36 (LWP 18233):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000001b4e in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x0000561ede86a738 in ?? ()
#4 0x00007f8be30045c0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f8be30045e0 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 35 (LWP 18232):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000001608 in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x0000561ede86a6b8 in ?? ()
#4 0x00007f8be38055c0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f8be38055e0 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 34 (LWP 18231):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000001a56 in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x0000561ede86a638 in ?? ()
#4 0x00007f8be40065c0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f8be40065e0 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 33 (LWP 18230):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000872 in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x0000561ede86a5b8 in ?? ()
#4 0x00007f8be48075c0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f8be48075e0 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 32 (LWP 18229):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 31 (LWP 18228):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 30 (LWP 18227):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 29 (LWP 18226):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 28 (LWP 18225):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 27 (LWP 18224):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 26 (LWP 18223):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 25 (LWP 18222):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 24 (LWP 18221):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 23 (LWP 18220):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 22 (LWP 18219):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 21 (LWP 18218):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 20 (LWP 18217):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 19 (LWP 18216):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 18 (LWP 18215):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 17 (LWP 18214):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 16 (LWP 18213):
#0 0x00007f8c00aeffb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 15 (LWP 18212):
#0 0x00007f8c00aeffb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000001 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000561ede510b88 in ?? ()
#5 0x00007f8bed8196a0 in ?? ()
#6 0x0000000000000002 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 14 (LWP 18210):
#0 0x00007f8c00aeffb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 13 (LWP 18209):
#0 0x00007f8c00aefad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 12 (LWP 18208):
#0 0x00007f8bfeba2947 in ?? ()
#1 0x00007f8bef81d680 in ?? ()
#2 0x00007f8bfa128571 in ?? ()
#3 0x00007f8bef81d680 in ?? ()
#4 0x0000561ede60b398 in ?? ()
#5 0x00007f8bef81d6c0 in ?? ()
#6 0x00007f8bef81d840 in ?? ()
#7 0x0000561ede6b73f0 in ?? ()
#8 0x00007f8bfa12a25d in ?? ()
#9 0x3fb953ac06720000 in ?? ()
#10 0x0000561ede5fcc00 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x0000561ede5fcc00 in ?? ()
#13 0x00000000de60b398 in ?? ()
#14 0x0000561e00000000 in ?? ()
#15 0x41da7d802edba1cc in ?? ()
#16 0x0000561ede6b73f0 in ?? ()
#17 0x00007f8bef81d720 in ?? ()
#18 0x00007f8bfa12eba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb953ac06720000 in ?? ()
#21 0x0000000000000000 in ?? ()
Thread 11 (LWP 18207):
#0 0x00007f8bfeba2947 in ?? ()
#1 0x00007f8bf001e680 in ?? ()
#2 0x00007f8bfa128571 in ?? ()
#3 0x00007f8bf001e680 in ?? ()
#4 0x0000561ede60b018 in ?? ()
#5 0x00007f8bf001e6c0 in ?? ()
#6 0x00007f8bf001e840 in ?? ()
#7 0x0000561ede6b73f0 in ?? ()
#8 0x00007f8bfa12a25d in ?? ()
#9 0x3fb96b2c7c5f4000 in ?? ()
#10 0x0000561ede5fc100 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x0000561ede5fc100 in ?? ()
#13 0x00000000de60b018 in ?? ()
#14 0x0000561e00000000 in ?? ()
#15 0x41da7d802edba1c9 in ?? ()
#16 0x0000561ede6b73f0 in ?? ()
#17 0x00007f8bf001e720 in ?? ()
#18 0x00007f8bfa12eba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb96b2c7c5f4000 in ?? ()
#21 0x0000000000000000 in ?? ()
Thread 10 (LWP 18206):
#0 0x00007f8bfeba2947 in ?? ()
#1 0x00007f8bf081f680 in ?? ()
#2 0x00007f8bfa128571 in ?? ()
#3 0x00007f8bf081f680 in ?? ()
#4 0x0000561ede60b558 in ?? ()
#5 0x00007f8bf081f6c0 in ?? ()
#6 0x00007f8bf081f840 in ?? ()
#7 0x0000561ede6b73f0 in ?? ()
#8 0x00007f8bfa12a25d in ?? ()
#9 0x3fb9518edbfb0000 in ?? ()
#10 0x0000561ede5fc680 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x0000561ede5fc680 in ?? ()
#13 0x00000000de60b558 in ?? ()
#14 0x0000561e00000000 in ?? ()
#15 0x41da7d802edba1c9 in ?? ()
#16 0x0000561ede6b73f0 in ?? ()
#17 0x00007f8bf081f720 in ?? ()
#18 0x00007f8bfa12eba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb9518edbfb0000 in ?? ()
#21 0x0000000000000000 in ?? ()
Thread 9 (LWP 18205):
#0 0x00007f8bfeba2947 in ?? ()
#1 0x00007f8bf2403680 in ?? ()
#2 0x00007f8bfa128571 in ?? ()
#3 0x00007f8bf2403680 in ?? ()
#4 0x0000561ede60ae58 in ?? ()
#5 0x00007f8bf24036c0 in ?? ()
#6 0x00007f8bf2403840 in ?? ()
#7 0x0000561ede6b73f0 in ?? ()
#8 0x00007f8bfa12a25d in ?? ()
#9 0x3fb94c1f057e4000 in ?? ()
#10 0x0000561ede5fbb80 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x0000561ede5fbb80 in ?? ()
#13 0x00000000de60ae58 in ?? ()
#14 0x0000561e00000000 in ?? ()
#15 0x41da7d802edba1cc in ?? ()
#16 0x0000561ede6b73f0 in ?? ()
#17 0x00007f8bf2403720 in ?? ()
#18 0x00007f8bfa12eba3 in ?? ()
#19 0x0000000000000000 in ?? ()
Thread 8 (LWP 18202):
#0 0x00007f8bfeb95bb9 in ?? ()
#1 0x00007f8bf3c06840 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 7 (LWP 18201):
#0 0x00007f8c00aeffb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 6 (LWP 18200):
#0 0x00007f8c00af39e2 in ?? ()
#1 0x0000561ede52bee0 in ?? ()
#2 0x00007f8bf2c044d0 in ?? ()
#3 0x00007f8bf2c04450 in ?? ()
#4 0x00007f8bf2c04570 in ?? ()
#5 0x00007f8bf2c04790 in ?? ()
#6 0x00007f8bf2c047a0 in ?? ()
#7 0x00007f8bf2c044e0 in ?? ()
#8 0x00007f8bf2c044d0 in ?? ()
#9 0x0000561ede52bc80 in ?? ()
#10 0x00007f8c010ff97f in ?? ()
#11 0x3ff0000000000000 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 5 (LWP 18193):
#0 0x00007f8c00aeffb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x000000000000002d in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000561ede6b14c8 in ?? ()
#5 0x00007f8bf4c08430 in ?? ()
#6 0x000000000000005a in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 4 (LWP 18192):
#0 0x00007f8c00aeffb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000002 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000561ede510848 in ?? ()
#5 0x00007f8bf5409790 in ?? ()
#6 0x0000000000000004 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 3 (LWP 18191):
#0 0x00007f8c00aeffb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000002 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000561ede5102a8 in ?? ()
#5 0x00007f8bf5c0a790 in ?? ()
#6 0x0000000000000004 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 2 (LWP 18190):
#0 0x00007f8c00aeffb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000002 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000561ede510188 in ?? ()
#5 0x00007f8bf640b790 in ?? ()
#6 0x0000000000000004 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 1 (LWP 18189):
#0 0x00007f8c00af3d50 in ?? ()
#1 0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20260502 14:07:08.365523 12921 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 2 with UUID fcb69d1fc3094c95bd74e18f784e388d and pid 18456
************************ BEGIN STACKS **************************
[New LWP 18459]
[New LWP 18460]
[New LWP 18461]
[New LWP 18462]
[New LWP 18468]
[New LWP 18469]
[New LWP 18470]
[New LWP 18473]
[New LWP 18474]
[New LWP 18475]
[New LWP 18476]
[New LWP 18477]
[New LWP 18478]
[New LWP 18480]
[New LWP 18481]
[New LWP 18482]
[New LWP 18483]
[New LWP 18484]
[New LWP 18485]
[New LWP 18486]
[New LWP 18487]
[New LWP 18488]
[New LWP 18489]
[New LWP 18490]
[New LWP 18491]
[New LWP 18492]
[New LWP 18493]
[New LWP 18494]
[New LWP 18495]
[New LWP 18496]
[New LWP 18497]
[New LWP 18498]
[New LWP 18499]
[New LWP 18500]
[New LWP 18501]
[New LWP 18502]
[New LWP 18503]
[New LWP 18504]
[New LWP 18505]
[New LWP 18506]
[New LWP 18507]
[New LWP 18508]
[New LWP 18509]
[New LWP 18510]
[New LWP 18511]
[New LWP 18512]
[New LWP 18513]
[New LWP 18514]
[New LWP 18515]
[New LWP 18516]
[New LWP 18517]
[New LWP 18518]
[New LWP 18519]
[New LWP 18520]
[New LWP 18521]
[New LWP 18522]
[New LWP 18523]
[New LWP 18524]
[New LWP 18525]
[New LWP 18526]
[New LWP 18527]
[New LWP 18528]
[New LWP 18529]
[New LWP 18530]
[New LWP 18531]
[New LWP 18532]
[New LWP 18533]
[New LWP 18534]
[New LWP 18535]
[New LWP 18536]
[New LWP 18537]
[New LWP 18538]
[New LWP 18539]
[New LWP 18540]
[New LWP 18541]
[New LWP 18542]
[New LWP 18543]
[New LWP 18544]
[New LWP 18545]
[New LWP 18546]
[New LWP 18547]
[New LWP 18548]
[New LWP 18549]
[New LWP 18550]
[New LWP 18551]
[New LWP 18552]
[New LWP 18553]
[New LWP 18554]
[New LWP 18555]
[New LWP 18556]
[New LWP 18557]
[New LWP 18558]
[New LWP 18559]
[New LWP 18560]
[New LWP 18561]
[New LWP 18562]
[New LWP 18563]
[New LWP 18564]
[New LWP 18565]
[New LWP 18566]
[New LWP 18567]
[New LWP 18568]
[New LWP 18569]
[New LWP 18570]
[New LWP 18571]
[New LWP 18572]
[New LWP 18573]
[New LWP 18574]
[New LWP 18575]
[New LWP 18576]
[New LWP 18577]
[New LWP 18578]
[New LWP 18579]
[New LWP 18580]
[New LWP 18581]
[New LWP 18582]
[New LWP 18583]
[New LWP 18584]
[New LWP 18585]
[New LWP 18586]
[New LWP 18587]
[New LWP 18588]
0x00007f9bbe848d50 in ?? ()
Id Target Id Frame
* 1 LWP 18456 "kudu" 0x00007f9bbe848d50 in ?? ()
2 LWP 18459 "kudu" 0x00007f9bbe844fb9 in ?? ()
3 LWP 18460 "kudu" 0x00007f9bbe844fb9 in ?? ()
4 LWP 18461 "kudu" 0x00007f9bbe844fb9 in ?? ()
5 LWP 18462 "kernel-watcher-" 0x00007f9bbe844fb9 in ?? ()
6 LWP 18468 "ntp client-1846" 0x00007f9bbe8489e2 in ?? ()
7 LWP 18469 "file cache-evic" 0x00007f9bbe844fb9 in ?? ()
8 LWP 18470 "sq_acceptor" 0x00007f9bbc8eabb9 in ?? ()
9 LWP 18473 "rpc reactor-184" 0x00007f9bbc8f7947 in ?? ()
10 LWP 18474 "rpc reactor-184" 0x00007f9bbc8f7947 in ?? ()
11 LWP 18475 "rpc reactor-184" 0x00007f9bbc8f7947 in ?? ()
12 LWP 18476 "rpc reactor-184" 0x00007f9bbc8f7947 in ?? ()
13 LWP 18477 "MaintenanceMgr " 0x00007f9bbe844ad3 in ?? ()
14 LWP 18478 "txn-status-mana" 0x00007f9bbe844fb9 in ?? ()
15 LWP 18480 "collect_and_rem" 0x00007f9bbe844fb9 in ?? ()
16 LWP 18481 "tc-session-exp-" 0x00007f9bbe844fb9 in ?? ()
17 LWP 18482 "rpc worker-1848" 0x00007f9bbe844ad3 in ?? ()
18 LWP 18483 "rpc worker-1848" 0x00007f9bbe844ad3 in ?? ()
19 LWP 18484 "rpc worker-1848" 0x00007f9bbe844ad3 in ?? ()
20 LWP 18485 "rpc worker-1848" 0x00007f9bbe844ad3 in ?? ()
21 LWP 18486 "rpc worker-1848" 0x00007f9bbe844ad3 in ?? ()
22 LWP 18487 "rpc worker-1848" 0x00007f9bbe844ad3 in ?? ()
23 LWP 18488 "rpc worker-1848" 0x00007f9bbe844ad3 in ?? ()
24 LWP 18489 "rpc worker-1848" 0x00007f9bbe844ad3 in ?? ()
25 LWP 18490 "rpc worker-1849" 0x00007f9bbe844ad3 in ?? ()
26 LWP 18491 "rpc worker-1849" 0x00007f9bbe844ad3 in ?? ()
27 LWP 18492 "rpc worker-1849" 0x00007f9bbe844ad3 in ?? ()
28 LWP 18493 "rpc worker-1849" 0x00007f9bbe844ad3 in ?? ()
29 LWP 18494 "rpc worker-1849" 0x00007f9bbe844ad3 in ?? ()
30 LWP 18495 "rpc worker-1849" 0x00007f9bbe844ad3 in ?? ()
31 LWP 18496 "rpc worker-1849" 0x00007f9bbe844ad3 in ?? ()
32 LWP 18497 "rpc worker-1849" 0x00007f9bbe844ad3 in ?? ()
33 LWP 18498 "rpc worker-1849" 0x00007f9bbe844ad3 in ?? ()
34 LWP 18499 "rpc worker-1849" 0x00007f9bbe844ad3 in ?? ()
35 LWP 18500 "rpc worker-1850" 0x00007f9bbe844ad3 in ?? ()
36 LWP 18501 "rpc worker-1850" 0x00007f9bbe844ad3 in ?? ()
37 LWP 18502 "rpc worker-1850" 0x00007f9bbe844ad3 in ?? ()
38 LWP 18503 "rpc worker-1850" 0x00007f9bbe844ad3 in ?? ()
39 LWP 18504 "rpc worker-1850" 0x00007f9bbe844ad3 in ?? ()
40 LWP 18505 "rpc worker-1850" 0x00007f9bbe844ad3 in ?? ()
41 LWP 18506 "rpc worker-1850" 0x00007f9bbe844ad3 in ?? ()
42 LWP 18507 "rpc worker-1850" 0x00007f9bbe844ad3 in ?? ()
43 LWP 18508 "rpc worker-1850" 0x00007f9bbe844ad3 in ?? ()
44 LWP 18509 "rpc worker-1850" 0x00007f9bbe844ad3 in ?? ()
45 LWP 18510 "rpc worker-1851" 0x00007f9bbe844ad3 in ?? ()
46 LWP 18511 "rpc worker-1851" 0x00007f9bbe844ad3 in ?? ()
47 LWP 18512 "rpc worker-1851" 0x00007f9bbe844ad3 in ?? ()
48 LWP 18513 "rpc worker-1851" 0x00007f9bbe844ad3 in ?? ()
49 LWP 18514 "rpc worker-1851" 0x00007f9bbe844ad3 in ?? ()
50 LWP 18515 "rpc worker-1851" 0x00007f9bbe844ad3 in ?? ()
51 LWP 18516 "rpc worker-1851" 0x00007f9bbe844ad3 in ?? ()
52 LWP 18517 "rpc worker-1851" 0x00007f9bbe844ad3 in ?? ()
53 LWP 18518 "rpc worker-1851" 0x00007f9bbe844ad3 in ?? ()
54 LWP 18519 "rpc worker-1851" 0x00007f9bbe844ad3 in ?? ()
55 LWP 18520 "rpc worker-1852" 0x00007f9bbe844ad3 in ?? ()
56 LWP 18521 "rpc worker-1852" 0x00007f9bbe844ad3 in ?? ()
57 LWP 18522 "rpc worker-1852" 0x00007f9bbe844ad3 in ?? ()
58 LWP 18523 "rpc worker-1852" 0x00007f9bbe844ad3 in ?? ()
59 LWP 18524 "rpc worker-1852" 0x00007f9bbe844ad3 in ?? ()
60 LWP 18525 "rpc worker-1852" 0x00007f9bbe844ad3 in ?? ()
61 LWP 18526 "rpc worker-1852" 0x00007f9bbe844ad3 in ?? ()
62 LWP 18527 "rpc worker-1852" 0x00007f9bbe844ad3 in ?? ()
63 LWP 18528 "rpc worker-1852" 0x00007f9bbe844ad3 in ?? ()
64 LWP 18529 "rpc worker-1852" 0x00007f9bbe844ad3 in ?? ()
65 LWP 18530 "rpc worker-1853" 0x00007f9bbe844ad3 in ?? ()
66 LWP 18531 "rpc worker-1853" 0x00007f9bbe844ad3 in ?? ()
67 LWP 18532 "rpc worker-1853" 0x00007f9bbe844ad3 in ?? ()
68 LWP 18533 "rpc worker-1853" 0x00007f9bbe844ad3 in ?? ()
69 LWP 18534 "rpc worker-1853" 0x00007f9bbe844ad3 in ?? ()
70 LWP 18535 "rpc worker-1853" 0x00007f9bbe844ad3 in ?? ()
71 LWP 18536 "rpc worker-1853" 0x00007f9bbe844ad3 in ?? ()
72 LWP 18537 "rpc worker-1853" 0x00007f9bbe844ad3 in ?? ()
73 LWP 18538 "rpc worker-1853" 0x00007f9bbe844ad3 in ?? ()
74 LWP 18539 "rpc worker-1853" 0x00007f9bbe844ad3 in ?? ()
75 LWP 18540 "rpc worker-1854" 0x00007f9bbe844ad3 in ?? ()
76 LWP 18541 "rpc worker-1854" 0x00007f9bbe844ad3 in ?? ()
77 LWP 18542 "rpc worker-1854" 0x00007f9bbe844ad3 in ?? ()
78 LWP 18543 "rpc worker-1854" 0x00007f9bbe844ad3 in ?? ()
79 LWP 18544 "rpc worker-1854" 0x00007f9bbe844ad3 in ?? ()
80 LWP 18545 "rpc worker-1854" 0x00007f9bbe844ad3 in ?? ()
81 LWP 18546 "rpc worker-1854" 0x00007f9bbe844ad3 in ?? ()
82 LWP 18547 "rpc worker-1854" 0x00007f9bbe844ad3 in ?? ()
83 LWP 18548 "rpc worker-1854" 0x00007f9bbe844ad3 in ?? ()
84 LWP 18549 "rpc worker-1854" 0x00007f9bbe844ad3 in ?? ()
85 LWP 18550 "rpc worker-1855" 0x00007f9bbe844ad3 in ?? ()
86 LWP 18551 "rpc worker-1855" 0x00007f9bbe844ad3 in ?? ()
87 LWP 18552 "rpc worker-1855" 0x00007f9bbe844ad3 in ?? ()
88 LWP 18553 "rpc worker-1855" 0x00007f9bbe844ad3 in ?? ()
89 LWP 18554 "rpc worker-1855" 0x00007f9bbe844ad3 in ?? ()
90 LWP 18555 "rpc worker-1855" 0x00007f9bbe844ad3 in ?? ()
91 LWP 18556 "rpc worker-1855" 0x00007f9bbe844ad3 in ?? ()
92 LWP 18557 "rpc worker-1855" 0x00007f9bbe844ad3 in ?? ()
93 LWP 18558 "rpc worker-1855" 0x00007f9bbe844ad3 in ?? ()
94 LWP 18559 "rpc worker-1855" 0x00007f9bbe844ad3 in ?? ()
95 LWP 18560 "rpc worker-1856" 0x00007f9bbe844ad3 in ?? ()
96 LWP 18561 "rpc worker-1856" 0x00007f9bbe844ad3 in ?? ()
97 LWP 18562 "rpc worker-1856" 0x00007f9bbe844ad3 in ?? ()
98 LWP 18563 "rpc worker-1856" 0x00007f9bbe844ad3 in ?? ()
99 LWP 18564 "rpc worker-1856" 0x00007f9bbe844ad3 in ?? ()
100 LWP 18565 "rpc worker-1856" 0x00007f9bbe844ad3 in ?? ()
101 LWP 18566 "rpc worker-1856" 0x00007f9bbe844ad3 in ?? ()
102 LWP 18567 "rpc worker-1856" 0x00007f9bbe844ad3 in ?? ()
103 LWP 18568 "rpc worker-1856" 0x00007f9bbe844ad3 in ?? ()
104 LWP 18569 "rpc worker-1856" 0x00007f9bbe844ad3 in ?? ()
105 LWP 18570 "rpc worker-1857" 0x00007f9bbe844ad3 in ?? ()
106 LWP 18571 "rpc worker-1857" 0x00007f9bbe844ad3 in ?? ()
107 LWP 18572 "rpc worker-1857" 0x00007f9bbe844ad3 in ?? ()
108 LWP 18573 "rpc worker-1857" 0x00007f9bbe844ad3 in ?? ()
109 LWP 18574 "rpc worker-1857" 0x00007f9bbe844ad3 in ?? ()
110 LWP 18575 "rpc worker-1857" 0x00007f9bbe844ad3 in ?? ()
111 LWP 18576 "rpc worker-1857" 0x00007f9bbe844ad3 in ?? ()
112 LWP 18577 "rpc worker-1857" 0x00007f9bbe844ad3 in ?? ()
113 LWP 18578 "rpc worker-1857" 0x00007f9bbe844ad3 in ?? ()
114 LWP 18579 "rpc worker-1857" 0x00007f9bbe844ad3 in ?? ()
115 LWP 18580 "rpc worker-1858" 0x00007f9bbe844ad3 in ?? ()
116 LWP 18581 "rpc worker-1858" 0x00007f9bbe844ad3 in ?? ()
117 LWP 18582 "diag-logger-185" 0x00007f9bbe844fb9 in ?? ()
118 LWP 18583 "result-tracker-" 0x00007f9bbe844fb9 in ?? ()
119 LWP 18584 "excess-log-dele" 0x00007f9bbe844fb9 in ?? ()
120 LWP 18585 "tcmalloc-memory" 0x00007f9bbe844fb9 in ?? ()
121 LWP 18586 "acceptor-18586" 0x00007f9bbc8f8fc7 in ?? ()
122 LWP 18587 "heartbeat-18587" 0x00007f9bbe844fb9 in ?? ()
123 LWP 18588 "maintenance_sch" 0x00007f9bbe844fb9 in ?? ()
Thread 123 (LWP 18588):
#0 0x00007f9bbe844fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000024 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000055fa96095e50 in ?? ()
#5 0x00007f9b75502470 in ?? ()
#6 0x0000000000000048 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 122 (LWP 18587):
#0 0x00007f9bbe844fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x000000000000000b in ?? ()
#3 0x0000000100000081 in ?? ()
#4 0x000055fa95fe7634 in ?? ()
#5 0x00007f9b75d033f0 in ?? ()
#6 0x0000000000000017 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x00007f9b75d03410 in ?? ()
#9 0x0000000200000000 in ?? ()
#10 0x0000008000000189 in ?? ()
#11 0x00007f9b75d03470 in ?? ()
#12 0x00007f9bbe4848d1 in ?? ()
#13 0x0000000000000000 in ?? ()
Thread 121 (LWP 18586):
#0 0x00007f9bbc8f8fc7 in ?? ()
#1 0x00007f9b76504020 in ?? ()
#2 0x00007f9bbe495672 in ?? ()
#3 0x00007f9b76504020 in ?? ()
#4 0x0000000000080800 in ?? ()
#5 0x00007f9b765043e0 in ?? ()
#6 0x00007f9b76504090 in ?? ()
#7 0x000055fa95fa0978 in ?? ()
#8 0x00007f9bbe49b1c9 in ?? ()
#9 0x00007f9b76504510 in ?? ()
#10 0x00007f9b76504700 in ?? ()
#11 0x0000008000000005 in ?? ()
#12 0x00007f9bbb8c75f9 in ?? ()
#13 0x0000000000000000 in ?? ()
Thread 120 (LWP 18585):
#0 0x00007f9bbe844fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000001 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x00007ffd07178710 in ?? ()
#5 0x00007f9b76d05670 in ?? ()
#6 0x0000000000000002 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 119 (LWP 18584):
#0 0x00007f9bbe844fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 118 (LWP 18583):
#0 0x00007f9bbe844fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000009 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000055fa95f183e0 in ?? ()
#5 0x00007f9b77d07680 in ?? ()
#6 0x0000000000000012 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 117 (LWP 18582):
#0 0x00007f9bbe844fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000009 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000055fa96294790 in ?? ()
#5 0x00007f9b78508550 in ?? ()
#6 0x0000000000000012 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 116 (LWP 18581):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000007 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x000055fa962a933c in ?? ()
#4 0x00007f9b78d095c0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f9b78d095e0 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x000055fa962a9328 in ?? ()
#9 0x00007f9bbe844770 in ?? ()
#10 0x00007f9b78d095e0 in ?? ()
#11 0x00007f9b78d09640 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 115 (LWP 18580):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x000055fa962a92bc in ?? ()
#4 0x00007f9b7950a5c0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f9b7950a5e0 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x000055fa962a92a8 in ?? ()
#9 0x00007f9bbe844770 in ?? ()
#10 0x00007f9b7950a5e0 in ?? ()
#11 0x00007f9b7950a640 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 114 (LWP 18579):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 113 (LWP 18578):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 112 (LWP 18577):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 111 (LWP 18576):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 110 (LWP 18575):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 109 (LWP 18574):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 108 (LWP 18573):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 107 (LWP 18572):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 106 (LWP 18571):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 105 (LWP 18570):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 104 (LWP 18569):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 103 (LWP 18568):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 102 (LWP 18567):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 101 (LWP 18566):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 100 (LWP 18565):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 99 (LWP 18564):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 98 (LWP 18563):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 97 (LWP 18562):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 96 (LWP 18561):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 95 (LWP 18560):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 94 (LWP 18559):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 93 (LWP 18558):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 92 (LWP 18557):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 91 (LWP 18556):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 90 (LWP 18555):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 89 (LWP 18554):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 88 (LWP 18553):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 87 (LWP 18552):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 86 (LWP 18551):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 85 (LWP 18550):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 84 (LWP 18549):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 83 (LWP 18548):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 82 (LWP 18547):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 81 (LWP 18546):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 80 (LWP 18545):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 79 (LWP 18544):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 78 (LWP 18543):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 77 (LWP 18542):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 76 (LWP 18541):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x000000000000020a in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x000055fa9629ddb8 in ?? ()
#4 0x00007f9b8cd315c0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f9b8cd315e0 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 75 (LWP 18540):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000379 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x000055fa9629dd3c in ?? ()
#4 0x00007f9b8d5325c0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f9b8d5325e0 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x000055fa9629dd28 in ?? ()
#9 0x00007f9bbe844770 in ?? ()
#10 0x00007f9b8d5325e0 in ?? ()
#11 0x00007f9b8d532640 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 74 (LWP 18539):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 73 (LWP 18538):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 72 (LWP 18537):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 71 (LWP 18536):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 70 (LWP 18535):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 69 (LWP 18534):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 68 (LWP 18533):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 67 (LWP 18532):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 66 (LWP 18531):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 65 (LWP 18530):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 64 (LWP 18529):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 63 (LWP 18528):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 62 (LWP 18527):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 61 (LWP 18526):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 60 (LWP 18525):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 59 (LWP 18524):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 58 (LWP 18523):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 57 (LWP 18522):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 56 (LWP 18521):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000002 in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x000055fa9629d238 in ?? ()
#4 0x00007f9b96d455c0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f9b96d455e0 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 55 (LWP 18520):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 54 (LWP 18519):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 53 (LWP 18518):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 52 (LWP 18517):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 51 (LWP 18516):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 50 (LWP 18515):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 49 (LWP 18514):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 48 (LWP 18513):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 47 (LWP 18512):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 46 (LWP 18511):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 45 (LWP 18510):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 44 (LWP 18509):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 43 (LWP 18508):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 42 (LWP 18507):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 41 (LWP 18506):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 40 (LWP 18505):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 39 (LWP 18504):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 38 (LWP 18503):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 37 (LWP 18502):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 36 (LWP 18501):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x000055fa9629c73c in ?? ()
#4 0x00007f9ba0d595c0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f9ba0d595e0 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x000055fa9629c728 in ?? ()
#9 0x00007f9bbe844770 in ?? ()
#10 0x00007f9ba0d595e0 in ?? ()
#11 0x00007f9ba0d59640 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 35 (LWP 18500):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000002 in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x000055fa9629c6b8 in ?? ()
#4 0x00007f9ba155a5c0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f9ba155a5e0 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 34 (LWP 18499):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000032 in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x000055fa9629c638 in ?? ()
#4 0x00007f9ba1d5b5c0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f9ba1d5b5e0 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 33 (LWP 18498):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 32 (LWP 18497):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 31 (LWP 18496):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 30 (LWP 18495):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 29 (LWP 18494):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 28 (LWP 18493):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 27 (LWP 18492):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 26 (LWP 18491):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 25 (LWP 18490):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 24 (LWP 18489):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 23 (LWP 18488):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 22 (LWP 18487):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 21 (LWP 18486):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 20 (LWP 18485):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 19 (LWP 18484):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 18 (LWP 18483):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 17 (LWP 18482):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 16 (LWP 18481):
#0 0x00007f9bbe844fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 15 (LWP 18480):
#0 0x00007f9bbe844fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000001 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000055fa95efeb88 in ?? ()
#5 0x00007f9bab56e6a0 in ?? ()
#6 0x0000000000000002 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 14 (LWP 18478):
#0 0x00007f9bbe844fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 13 (LWP 18477):
#0 0x00007f9bbe844ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 12 (LWP 18476):
#0 0x00007f9bbc8f7947 in ?? ()
#1 0x00007f9bad572680 in ?? ()
#2 0x00007f9bb7e7d571 in ?? ()
#3 0x00007f9bad572680 in ?? ()
#4 0x000055fa95ff9398 in ?? ()
#5 0x00007f9bad5726c0 in ?? ()
#6 0x00007f9bad572840 in ?? ()
#7 0x000055fa960a53f0 in ?? ()
#8 0x00007f9bb7e7f25d in ?? ()
#9 0x3fb953d116070000 in ?? ()
#10 0x000055fa95feac00 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055fa95feac00 in ?? ()
#13 0x0000000095ff9398 in ?? ()
#14 0x000055fa00000000 in ?? ()
#15 0x41da7d802edba1c9 in ?? ()
#16 0x000055fa960a53f0 in ?? ()
#17 0x00007f9bad572720 in ?? ()
#18 0x00007f9bb7e83ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb953d116070000 in ?? ()
#21 0x0000000000000000 in ?? ()
Thread 11 (LWP 18475):
#0 0x00007f9bbc8f7947 in ?? ()
#1 0x00007f9badd73680 in ?? ()
#2 0x00007f9bb7e7d571 in ?? ()
#3 0x00007f9badd73680 in ?? ()
#4 0x000055fa95ff9018 in ?? ()
#5 0x00007f9badd736c0 in ?? ()
#6 0x00007f9badd73840 in ?? ()
#7 0x000055fa960a53f0 in ?? ()
#8 0x00007f9bb7e7f25d in ?? ()
#9 0x3fb953d970ebc000 in ?? ()
#10 0x000055fa95fe9600 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055fa95fe9600 in ?? ()
#13 0x0000000095ff9018 in ?? ()
#14 0x000055fa00000000 in ?? ()
#15 0x41da7d802edba1cb in ?? ()
#16 0x000055fa960a53f0 in ?? ()
#17 0x00007f9badd73720 in ?? ()
#18 0x00007f9bb7e83ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb953d970ebc000 in ?? ()
#21 0x0000000000000000 in ?? ()
Thread 10 (LWP 18474):
#0 0x00007f9bbc8f7947 in ?? ()
#1 0x00007f9bae574680 in ?? ()
#2 0x00007f9bb7e7d571 in ?? ()
#3 0x00007f9bae574680 in ?? ()
#4 0x000055fa95ff9558 in ?? ()
#5 0x00007f9bae5746c0 in ?? ()
#6 0x00007f9bae574840 in ?? ()
#7 0x000055fa960a53f0 in ?? ()
#8 0x00007f9bb7e7f25d in ?? ()
#9 0x3fb2143523108000 in ?? ()
#10 0x000055fa95fe9b80 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055fa95fe9b80 in ?? ()
#13 0x0000000095ff9558 in ?? ()
#14 0x000055fa00000000 in ?? ()
#15 0x41da7d802edba1c9 in ?? ()
#16 0x000055fa960a53f0 in ?? ()
#17 0x00007f9bae574720 in ?? ()
#18 0x00007f9bb7e83ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb2143523108000 in ?? ()
#21 0x0000000000000000 in ?? ()
Thread 9 (LWP 18473):
#0 0x00007f9bbc8f7947 in ?? ()
#1 0x00007f9bb0158680 in ?? ()
#2 0x00007f9bb7e7d571 in ?? ()
#3 0x00007f9bb0158680 in ?? ()
#4 0x000055fa95ff8e58 in ?? ()
#5 0x00007f9bb01586c0 in ?? ()
#6 0x00007f9bb0158840 in ?? ()
#7 0x000055fa960a53f0 in ?? ()
#8 0x00007f9bb7e7f25d in ?? ()
#9 0x3fb97f594048c000 in ?? ()
#10 0x000055fa95fea100 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055fa95fea100 in ?? ()
#13 0x0000000095ff8e58 in ?? ()
#14 0x000055fa00000000 in ?? ()
#15 0x41da7d802edba1c8 in ?? ()
#16 0x000055fa960a53f0 in ?? ()
#17 0x00007f9bb0158720 in ?? ()
#18 0x00007f9bb7e83ba3 in ?? ()
#19 0x0000000000000000 in ?? ()
Thread 8 (LWP 18470):
#0 0x00007f9bbc8eabb9 in ?? ()
#1 0x00007f9bb195b840 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 7 (LWP 18469):
#0 0x00007f9bbe844fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 6 (LWP 18468):
#0 0x00007f9bbe8489e2 in ?? ()
#1 0x000055fa95f19ee0 in ?? ()
#2 0x00007f9bb09594d0 in ?? ()
#3 0x00007f9bb0959450 in ?? ()
#4 0x00007f9bb0959570 in ?? ()
#5 0x00007f9bb0959790 in ?? ()
#6 0x00007f9bb09597a0 in ?? ()
#7 0x00007f9bb09594e0 in ?? ()
#8 0x00007f9bb09594d0 in ?? ()
#9 0x000055fa95f19c80 in ?? ()
#10 0x00007f9bbee5497f in ?? ()
#11 0x3ff0000000000000 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 5 (LWP 18462):
#0 0x00007f9bbe844fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x000000000000002d in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000055fa9609f4c8 in ?? ()
#5 0x00007f9bb295d430 in ?? ()
#6 0x000000000000005a in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 4 (LWP 18461):
#0 0x00007f9bbe844fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000002 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000055fa95efe848 in ?? ()
#5 0x00007f9bb315e790 in ?? ()
#6 0x0000000000000004 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 3 (LWP 18460):
#0 0x00007f9bbe844fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000002 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000055fa95efe2a8 in ?? ()
#5 0x00007f9bb395f790 in ?? ()
#6 0x0000000000000004 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 2 (LWP 18459):
#0 0x00007f9bbe844fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000002 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000055fa95efe188 in ?? ()
#5 0x00007f9bb4160790 in ?? ()
#6 0x0000000000000004 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 1 (LWP 18456):
#0 0x00007f9bbe848d50 in ?? ()
#1 0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20260502 14:07:08.856070 12921 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 3 with UUID bd0a653794c34d9591e2d5c89c802493 and pid 18057
************************ BEGIN STACKS **************************
[New LWP 18058]
[New LWP 18059]
[New LWP 18060]
[New LWP 18061]
[New LWP 18067]
[New LWP 18068]
[New LWP 18069]
[New LWP 18072]
[New LWP 18073]
[New LWP 18074]
[New LWP 18075]
[New LWP 18076]
[New LWP 18077]
[New LWP 18078]
[New LWP 18079]
[New LWP 18080]
[New LWP 18081]
[New LWP 18082]
[New LWP 18083]
[New LWP 18084]
[New LWP 18085]
[New LWP 18086]
[New LWP 18087]
[New LWP 18088]
[New LWP 18089]
[New LWP 18090]
[New LWP 18091]
[New LWP 18092]
[New LWP 18093]
[New LWP 18094]
[New LWP 18095]
[New LWP 18096]
[New LWP 18097]
[New LWP 18098]
[New LWP 18099]
[New LWP 18100]
[New LWP 18101]
[New LWP 18102]
[New LWP 18103]
[New LWP 18104]
[New LWP 18105]
[New LWP 18106]
[New LWP 18107]
[New LWP 18108]
[New LWP 18109]
[New LWP 18110]
[New LWP 18111]
[New LWP 18112]
[New LWP 18113]
[New LWP 18114]
[New LWP 18115]
[New LWP 18116]
[New LWP 18117]
[New LWP 18118]
[New LWP 18119]
[New LWP 18120]
[New LWP 18121]
[New LWP 18122]
[New LWP 18123]
[New LWP 18124]
[New LWP 18125]
[New LWP 18126]
[New LWP 18127]
[New LWP 18128]
[New LWP 18129]
[New LWP 18130]
[New LWP 18131]
[New LWP 18132]
[New LWP 18133]
[New LWP 18134]
[New LWP 18135]
[New LWP 18136]
[New LWP 18137]
[New LWP 18138]
[New LWP 18139]
[New LWP 18140]
[New LWP 18141]
[New LWP 18142]
[New LWP 18143]
[New LWP 18144]
[New LWP 18145]
[New LWP 18146]
[New LWP 18147]
[New LWP 18148]
[New LWP 18149]
[New LWP 18150]
[New LWP 18151]
[New LWP 18152]
[New LWP 18153]
[New LWP 18154]
[New LWP 18155]
[New LWP 18156]
[New LWP 18157]
[New LWP 18158]
[New LWP 18159]
[New LWP 18160]
[New LWP 18161]
[New LWP 18162]
[New LWP 18163]
[New LWP 18164]
[New LWP 18165]
[New LWP 18166]
[New LWP 18167]
[New LWP 18168]
[New LWP 18169]
[New LWP 18170]
[New LWP 18171]
[New LWP 18172]
[New LWP 18173]
[New LWP 18174]
[New LWP 18175]
[New LWP 18176]
[New LWP 18177]
[New LWP 18178]
[New LWP 18179]
[New LWP 18180]
[New LWP 18181]
[New LWP 18182]
[New LWP 18183]
[New LWP 18184]
[New LWP 18185]
[New LWP 18186]
0x00007f4599be7d50 in ?? ()
Id Target Id Frame
* 1 LWP 18057 "kudu" 0x00007f4599be7d50 in ?? ()
2 LWP 18058 "kudu" 0x00007f4599be3fb9 in ?? ()
3 LWP 18059 "kudu" 0x00007f4599be3fb9 in ?? ()
4 LWP 18060 "kudu" 0x00007f4599be3fb9 in ?? ()
5 LWP 18061 "kernel-watcher-" 0x00007f4599be3fb9 in ?? ()
6 LWP 18067 "ntp client-1806" 0x00007f4599be79e2 in ?? ()
7 LWP 18068 "file cache-evic" 0x00007f4599be3fb9 in ?? ()
8 LWP 18069 "sq_acceptor" 0x00007f4597c89bb9 in ?? ()
9 LWP 18072 "rpc reactor-180" 0x00007f4597c96947 in ?? ()
10 LWP 18073 "rpc reactor-180" 0x00007f4597c96947 in ?? ()
11 LWP 18074 "rpc reactor-180" 0x00007f4597c96947 in ?? ()
12 LWP 18075 "rpc reactor-180" 0x00007f4597c96947 in ?? ()
13 LWP 18076 "MaintenanceMgr " 0x00007f4599be3ad3 in ?? ()
14 LWP 18077 "txn-status-mana" 0x00007f4599be3fb9 in ?? ()
15 LWP 18078 "collect_and_rem" 0x00007f4599be3fb9 in ?? ()
16 LWP 18079 "tc-session-exp-" 0x00007f4599be3fb9 in ?? ()
17 LWP 18080 "rpc worker-1808" 0x00007f4599be3ad3 in ?? ()
18 LWP 18081 "rpc worker-1808" 0x00007f4599be3ad3 in ?? ()
19 LWP 18082 "rpc worker-1808" 0x00007f4599be3ad3 in ?? ()
20 LWP 18083 "rpc worker-1808" 0x00007f4599be3ad3 in ?? ()
21 LWP 18084 "rpc worker-1808" 0x00007f4599be3ad3 in ?? ()
22 LWP 18085 "rpc worker-1808" 0x00007f4599be3ad3 in ?? ()
23 LWP 18086 "rpc worker-1808" 0x00007f4599be3ad3 in ?? ()
24 LWP 18087 "rpc worker-1808" 0x00007f4599be3ad3 in ?? ()
25 LWP 18088 "rpc worker-1808" 0x00007f4599be3ad3 in ?? ()
26 LWP 18089 "rpc worker-1808" 0x00007f4599be3ad3 in ?? ()
27 LWP 18090 "rpc worker-1809" 0x00007f4599be3ad3 in ?? ()
28 LWP 18091 "rpc worker-1809" 0x00007f4599be3ad3 in ?? ()
29 LWP 18092 "rpc worker-1809" 0x00007f4599be3ad3 in ?? ()
30 LWP 18093 "rpc worker-1809" 0x00007f4599be3ad3 in ?? ()
31 LWP 18094 "rpc worker-1809" 0x00007f4599be3ad3 in ?? ()
32 LWP 18095 "rpc worker-1809" 0x00007f4599be3ad3 in ?? ()
33 LWP 18096 "rpc worker-1809" 0x00007f4599be3ad3 in ?? ()
34 LWP 18097 "rpc worker-1809" 0x00007f4599be3ad3 in ?? ()
35 LWP 18098 "rpc worker-1809" 0x00007f4599be3ad3 in ?? ()
36 LWP 18099 "rpc worker-1809" 0x00007f4599be3ad3 in ?? ()
37 LWP 18100 "rpc worker-1810" 0x00007f4599be3ad3 in ?? ()
38 LWP 18101 "rpc worker-1810" 0x00007f4599be3ad3 in ?? ()
39 LWP 18102 "rpc worker-1810" 0x00007f4599be3ad3 in ?? ()
40 LWP 18103 "rpc worker-1810" 0x00007f4599be3ad3 in ?? ()
41 LWP 18104 "rpc worker-1810" 0x00007f4599be3ad3 in ?? ()
42 LWP 18105 "rpc worker-1810" 0x00007f4599be3ad3 in ?? ()
43 LWP 18106 "rpc worker-1810" 0x00007f4599be3ad3 in ?? ()
44 LWP 18107 "rpc worker-1810" 0x00007f4599be3ad3 in ?? ()
45 LWP 18108 "rpc worker-1810" 0x00007f4599be3ad3 in ?? ()
46 LWP 18109 "rpc worker-1810" 0x00007f4599be3ad3 in ?? ()
47 LWP 18110 "rpc worker-1811" 0x00007f4599be3ad3 in ?? ()
48 LWP 18111 "rpc worker-1811" 0x00007f4599be3ad3 in ?? ()
49 LWP 18112 "rpc worker-1811" 0x00007f4599be3ad3 in ?? ()
50 LWP 18113 "rpc worker-1811" 0x00007f4599be3ad3 in ?? ()
51 LWP 18114 "rpc worker-1811" 0x00007f4599be3ad3 in ?? ()
52 LWP 18115 "rpc worker-1811" 0x00007f4599be3ad3 in ?? ()
53 LWP 18116 "rpc worker-1811" 0x00007f4599be3ad3 in ?? ()
54 LWP 18117 "rpc worker-1811" 0x00007f4599be3ad3 in ?? ()
55 LWP 18118 "rpc worker-1811" 0x00007f4599be3ad3 in ?? ()
56 LWP 18119 "rpc worker-1811" 0x00007f4599be3ad3 in ?? ()
57 LWP 18120 "rpc worker-1812" 0x00007f4599be3ad3 in ?? ()
58 LWP 18121 "rpc worker-1812" 0x00007f4599be3ad3 in ?? ()
59 LWP 18122 "rpc worker-1812" 0x00007f4599be3ad3 in ?? ()
60 LWP 18123 "rpc worker-1812" 0x00007f4599be3ad3 in ?? ()
61 LWP 18124 "rpc worker-1812" 0x00007f4599be3ad3 in ?? ()
62 LWP 18125 "rpc worker-1812" 0x00007f4599be3ad3 in ?? ()
63 LWP 18126 "rpc worker-1812" 0x00007f4599be3ad3 in ?? ()
64 LWP 18127 "rpc worker-1812" 0x00007f4599be3ad3 in ?? ()
65 LWP 18128 "rpc worker-1812" 0x00007f4599be3ad3 in ?? ()
66 LWP 18129 "rpc worker-1812" 0x00007f4599be3ad3 in ?? ()
67 LWP 18130 "rpc worker-1813" 0x00007f4599be3ad3 in ?? ()
68 LWP 18131 "rpc worker-1813" 0x00007f4599be3ad3 in ?? ()
69 LWP 18132 "rpc worker-1813" 0x00007f4599be3ad3 in ?? ()
70 LWP 18133 "rpc worker-1813" 0x00007f4599be3ad3 in ?? ()
71 LWP 18134 "rpc worker-1813" 0x00007f4599be3ad3 in ?? ()
72 LWP 18135 "rpc worker-1813" 0x00007f4599be3ad3 in ?? ()
73 LWP 18136 "rpc worker-1813" 0x00007f4599be3ad3 in ?? ()
74 LWP 18137 "rpc worker-1813" 0x00007f4599be3ad3 in ?? ()
75 LWP 18138 "rpc worker-1813" 0x00007f4599be3ad3 in ?? ()
76 LWP 18139 "rpc worker-1813" 0x00007f4599be3ad3 in ?? ()
77 LWP 18140 "rpc worker-1814" 0x00007f4599be3ad3 in ?? ()
78 LWP 18141 "rpc worker-1814" 0x00007f4599be3ad3 in ?? ()
79 LWP 18142 "rpc worker-1814" 0x00007f4599be3ad3 in ?? ()
80 LWP 18143 "rpc worker-1814" 0x00007f4599be3ad3 in ?? ()
81 LWP 18144 "rpc worker-1814" 0x00007f4599be3ad3 in ?? ()
82 LWP 18145 "rpc worker-1814" 0x00007f4599be3ad3 in ?? ()
83 LWP 18146 "rpc worker-1814" 0x00007f4599be3ad3 in ?? ()
84 LWP 18147 "rpc worker-1814" 0x00007f4599be3ad3 in ?? ()
85 LWP 18148 "rpc worker-1814" 0x00007f4599be3ad3 in ?? ()
86 LWP 18149 "rpc worker-1814" 0x00007f4599be3ad3 in ?? ()
87 LWP 18150 "rpc worker-1815" 0x00007f4599be3ad3 in ?? ()
88 LWP 18151 "rpc worker-1815" 0x00007f4599be3ad3 in ?? ()
89 LWP 18152 "rpc worker-1815" 0x00007f4599be3ad3 in ?? ()
90 LWP 18153 "rpc worker-1815" 0x00007f4599be3ad3 in ?? ()
91 LWP 18154 "rpc worker-1815" 0x00007f4599be3ad3 in ?? ()
92 LWP 18155 "rpc worker-1815" 0x00007f4599be3ad3 in ?? ()
93 LWP 18156 "rpc worker-1815" 0x00007f4599be3ad3 in ?? ()
94 LWP 18157 "rpc worker-1815" 0x00007f4599be3ad3 in ?? ()
95 LWP 18158 "rpc worker-1815" 0x00007f4599be3ad3 in ?? ()
96 LWP 18159 "rpc worker-1815" 0x00007f4599be3ad3 in ?? ()
97 LWP 18160 "rpc worker-1816" 0x00007f4599be3ad3 in ?? ()
98 LWP 18161 "rpc worker-1816" 0x00007f4599be3ad3 in ?? ()
99 LWP 18162 "rpc worker-1816" 0x00007f4599be3ad3 in ?? ()
100 LWP 18163 "rpc worker-1816" 0x00007f4599be3ad3 in ?? ()
101 LWP 18164 "rpc worker-1816" 0x00007f4599be3ad3 in ?? ()
102 LWP 18165 "rpc worker-1816" 0x00007f4599be3ad3 in ?? ()
103 LWP 18166 "rpc worker-1816" 0x00007f4599be3ad3 in ?? ()
104 LWP 18167 "rpc worker-1816" 0x00007f4599be3ad3 in ?? ()
105 LWP 18168 "rpc worker-1816" 0x00007f4599be3ad3 in ?? ()
106 LWP 18169 "rpc worker-1816" 0x00007f4599be3ad3 in ?? ()
107 LWP 18170 "rpc worker-1817" 0x00007f4599be3ad3 in ?? ()
108 LWP 18171 "rpc worker-1817" 0x00007f4599be3ad3 in ?? ()
109 LWP 18172 "rpc worker-1817" 0x00007f4599be3ad3 in ?? ()
110 LWP 18173 "rpc worker-1817" 0x00007f4599be3ad3 in ?? ()
111 LWP 18174 "rpc worker-1817" 0x00007f4599be3ad3 in ?? ()
112 LWP 18175 "rpc worker-1817" 0x00007f4599be3ad3 in ?? ()
113 LWP 18176 "rpc worker-1817" 0x00007f4599be3ad3 in ?? ()
114 LWP 18177 "rpc worker-1817" 0x00007f4599be3ad3 in ?? ()
115 LWP 18178 "rpc worker-1817" 0x00007f4599be3ad3 in ?? ()
116 LWP 18179 "rpc worker-1817" 0x00007f4599be3ad3 in ?? ()
117 LWP 18180 "diag-logger-181" 0x00007f4599be3fb9 in ?? ()
118 LWP 18181 "result-tracker-" 0x00007f4599be3fb9 in ?? ()
119 LWP 18182 "excess-log-dele" 0x00007f4599be3fb9 in ?? ()
120 LWP 18183 "tcmalloc-memory" 0x00007f4599be3fb9 in ?? ()
121 LWP 18184 "acceptor-18184" 0x00007f4597c97fc7 in ?? ()
122 LWP 18185 "heartbeat-18185" 0x00007f4599be3fb9 in ?? ()
123 LWP 18186 "maintenance_sch" 0x00007f4599be3fb9 in ?? ()
Thread 123 (LWP 18186):
#0 0x00007f4599be3fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000028 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000559fd1695e50 in ?? ()
#5 0x00007f45510a2470 in ?? ()
#6 0x0000000000000050 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 122 (LWP 18185):
#0 0x00007f4599be3fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x000000000000000a in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000559fd15e7630 in ?? ()
#5 0x00007f45518a33f0 in ?? ()
#6 0x0000000000000014 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 121 (LWP 18184):
#0 0x00007f4597c97fc7 in ?? ()
#1 0x00007f45520a40d8 in ?? ()
#2 0x0000000299834672 in ?? ()
#3 0x00007f4599653060 in ?? ()
#4 0x0000000000080800 in ?? ()
#5 0x00007f45520a43e0 in ?? ()
#6 0x00007f45520a4090 in ?? ()
#7 0x0000559fd15a0978 in ?? ()
#8 0x00007f459983a1c9 in ?? ()
#9 0x00007f45520a4510 in ?? ()
#10 0x00007f45520a4700 in ?? ()
#11 0x0000008000000003 in ?? ()
#12 0x00007f45520a40d8 in ?? ()
#13 0x00007f45520a40c0 in ?? ()
#14 0x00007f459929b9e1 in ?? ()
#15 0x4014000000000000 in ?? ()
#16 0x00007f45520a4078 in ?? ()
#17 0x0000000000000000 in ?? ()
Thread 120 (LWP 18183):
#0 0x00007f4599be3fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000001 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x00007ffe4684fa50 in ?? ()
#5 0x00007f45528a5670 in ?? ()
#6 0x0000000000000002 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 119 (LWP 18182):
#0 0x00007f4599be3fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 118 (LWP 18181):
#0 0x00007f4599be3fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x000000000000000a in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000559fd15183e0 in ?? ()
#5 0x00007f45538a7680 in ?? ()
#6 0x0000000000000014 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 117 (LWP 18180):
#0 0x00007f4599be3fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x000000000000000a in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000559fd1826690 in ?? ()
#5 0x00007f45540a8550 in ?? ()
#6 0x0000000000000014 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 116 (LWP 18179):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000003 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x0000559fd17f36bc in ?? ()
#4 0x00007f45548a95c0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f45548a95e0 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x0000559fd17f36a8 in ?? ()
#9 0x00007f4599be3770 in ?? ()
#10 0x00007f45548a95e0 in ?? ()
#11 0x00007f45548a9640 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 115 (LWP 18178):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000005 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x0000559fd17f363c in ?? ()
#4 0x00007f45550aa5c0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f45550aa5e0 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x0000559fd17f3628 in ?? ()
#9 0x00007f4599be3770 in ?? ()
#10 0x00007f45550aa5e0 in ?? ()
#11 0x00007f45550aa640 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 114 (LWP 18177):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 113 (LWP 18176):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 112 (LWP 18175):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 111 (LWP 18174):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 110 (LWP 18173):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 109 (LWP 18172):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 108 (LWP 18171):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 107 (LWP 18170):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 106 (LWP 18169):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 105 (LWP 18168):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 104 (LWP 18167):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 103 (LWP 18166):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 102 (LWP 18165):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 101 (LWP 18164):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 100 (LWP 18163):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 99 (LWP 18162):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 98 (LWP 18161):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 97 (LWP 18160):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 96 (LWP 18159):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 95 (LWP 18158):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 94 (LWP 18157):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 93 (LWP 18156):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 92 (LWP 18155):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 91 (LWP 18154):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 90 (LWP 18153):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 89 (LWP 18152):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 88 (LWP 18151):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 87 (LWP 18150):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 86 (LWP 18149):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 85 (LWP 18148):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 84 (LWP 18147):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 83 (LWP 18146):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 82 (LWP 18145):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 81 (LWP 18144):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 80 (LWP 18143):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 79 (LWP 18142):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 78 (LWP 18141):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 77 (LWP 18140):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 76 (LWP 18139):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000002 in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x0000559fd17f20b8 in ?? ()
#4 0x00007f45688d15c0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f45688d15e0 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 75 (LWP 18138):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 74 (LWP 18137):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 73 (LWP 18136):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 72 (LWP 18135):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 71 (LWP 18134):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 70 (LWP 18133):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 69 (LWP 18132):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 68 (LWP 18131):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 67 (LWP 18130):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 66 (LWP 18129):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 65 (LWP 18128):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 64 (LWP 18127):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 63 (LWP 18126):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 62 (LWP 18125):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 61 (LWP 18124):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 60 (LWP 18123):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 59 (LWP 18122):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 58 (LWP 18121):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 57 (LWP 18120):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 56 (LWP 18119):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000002 in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x0000559fd16ef5b8 in ?? ()
#4 0x00007f45728e55c0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f45728e55e0 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 55 (LWP 18118):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 54 (LWP 18117):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 53 (LWP 18116):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 52 (LWP 18115):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 51 (LWP 18114):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 50 (LWP 18113):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 49 (LWP 18112):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 48 (LWP 18111):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 47 (LWP 18110):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 46 (LWP 18109):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 45 (LWP 18108):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 44 (LWP 18107):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 43 (LWP 18106):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 42 (LWP 18105):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 41 (LWP 18104):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 40 (LWP 18103):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 39 (LWP 18102):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 38 (LWP 18101):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 37 (LWP 18100):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 36 (LWP 18099):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000002 in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x0000559fd16eeab8 in ?? ()
#4 0x00007f457c8f95c0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f457c8f95e0 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 35 (LWP 18098):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 34 (LWP 18097):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 33 (LWP 18096):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 32 (LWP 18095):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 31 (LWP 18094):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 30 (LWP 18093):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 29 (LWP 18092):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 28 (LWP 18091):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 27 (LWP 18090):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 26 (LWP 18089):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 25 (LWP 18088):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 24 (LWP 18087):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 23 (LWP 18086):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 22 (LWP 18085):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 21 (LWP 18084):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 20 (LWP 18083):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 19 (LWP 18082):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 18 (LWP 18081):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 17 (LWP 18080):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 16 (LWP 18079):
#0 0x00007f4599be3fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000001 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000559fd1647b20 in ?? ()
#5 0x00007f458690d580 in ?? ()
#6 0x0000000000000002 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 15 (LWP 18078):
#0 0x00007f4599be3fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000002 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000559fd14feb88 in ?? ()
#5 0x00007f458710e6a0 in ?? ()
#6 0x0000000000000004 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 14 (LWP 18077):
#0 0x00007f4599be3fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000001 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000559fd17000d0 in ?? ()
#5 0x00007f458790f3b0 in ?? ()
#6 0x0000000000000002 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 13 (LWP 18076):
#0 0x00007f4599be3ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 12 (LWP 18075):
#0 0x00007f4597c96947 in ?? ()
#1 0x00007f4588911680 in ?? ()
#2 0x00007f459321c571 in ?? ()
#3 0x00007f4588911680 in ?? ()
#4 0x0000559fd15f9398 in ?? ()
#5 0x00007f45889116c0 in ?? ()
#6 0x00007f4588911840 in ?? ()
#7 0x0000559fd16a53f0 in ?? ()
#8 0x00007f459321e25d in ?? ()
#9 0x3fb965de3c67c000 in ?? ()
#10 0x0000559fd15eac00 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x0000559fd15eac00 in ?? ()
#13 0x00000000d15f9398 in ?? ()
#14 0x0000559f00000000 in ?? ()
#15 0x41da7d802edba1c9 in ?? ()
#16 0x0000559fd16a53f0 in ?? ()
#17 0x00007f4588911720 in ?? ()
#18 0x00007f4593222ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb965de3c67c000 in ?? ()
#21 0x0000000000000000 in ?? ()
Thread 11 (LWP 18074):
#0 0x00007f4597c96947 in ?? ()
#1 0x00007f4589112680 in ?? ()
#2 0x00007f459321c571 in ?? ()
#3 0x00007f4589112680 in ?? ()
#4 0x0000559fd15f9018 in ?? ()
#5 0x00007f45891126c0 in ?? ()
#6 0x00007f4589112840 in ?? ()
#7 0x0000559fd16a53f0 in ?? ()
#8 0x00007f459321e25d in ?? ()
#9 0x3fb974aad1090000 in ?? ()
#10 0x0000559fd15e9600 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x0000559fd15e9600 in ?? ()
#13 0x00000000d15f9018 in ?? ()
#14 0x0000559f00000000 in ?? ()
#15 0x41da7d802edba1c9 in ?? ()
#16 0x0000559fd16a53f0 in ?? ()
#17 0x00007f4589112720 in ?? ()
#18 0x00007f4593222ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb974aad1090000 in ?? ()
#21 0x0000000000000000 in ?? ()
Thread 10 (LWP 18073):
#0 0x00007f4597c96947 in ?? ()
#1 0x00007f4589913680 in ?? ()
#2 0x00007f459321c571 in ?? ()
#3 0x00007f4589913680 in ?? ()
#4 0x0000559fd15f9558 in ?? ()
#5 0x00007f45899136c0 in ?? ()
#6 0x00007f4589913840 in ?? ()
#7 0x0000559fd16a53f0 in ?? ()
#8 0x00007f459321e25d in ?? ()
#9 0x3fb9745b0a774000 in ?? ()
#10 0x0000559fd15e9b80 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x0000559fd15e9b80 in ?? ()
#13 0x00000000d15f9558 in ?? ()
#14 0x0000559f00000000 in ?? ()
#15 0x41da7d802edba1ca in ?? ()
#16 0x0000559fd16a53f0 in ?? ()
#17 0x00007f4589913720 in ?? ()
#18 0x00007f4593222ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb9745b0a774000 in ?? ()
#21 0x0000000000000000 in ?? ()
Thread 9 (LWP 18072):
#0 0x00007f4597c96947 in ?? ()
#1 0x00007f458b4f7680 in ?? ()
#2 0x00007f459321c571 in ?? ()
#3 0x00007f458b4f7680 in ?? ()
#4 0x0000559fd15f8e58 in ?? ()
#5 0x00007f458b4f76c0 in ?? ()
#6 0x00007f458b4f7840 in ?? ()
#7 0x0000559fd16a53f0 in ?? ()
#8 0x00007f459321e25d in ?? ()
#9 0x3fb96e9698758000 in ?? ()
#10 0x0000559fd15ea100 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x0000559fd15ea100 in ?? ()
#13 0x00000000d15f8e58 in ?? ()
#14 0x0000559f00000000 in ?? ()
#15 0x41da7d802edba1cb in ?? ()
#16 0x0000559fd16a53f0 in ?? ()
#17 0x00007f458b4f7720 in ?? ()
#18 0x00007f4593222ba3 in ?? ()
#19 0x0000000000000000 in ?? ()
Thread 8 (LWP 18069):
#0 0x00007f4597c89bb9 in ?? ()
#1 0x00007f458ccfa840 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 7 (LWP 18068):
#0 0x00007f4599be3fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 6 (LWP 18067):
#0 0x00007f4599be79e2 in ?? ()
#1 0x0000559fd1519ee0 in ?? ()
#2 0x00007f458bcf84d0 in ?? ()
#3 0x00007f458bcf8450 in ?? ()
#4 0x00007f458bcf8570 in ?? ()
#5 0x00007f458bcf8790 in ?? ()
#6 0x00007f458bcf87a0 in ?? ()
#7 0x00007f458bcf84e0 in ?? ()
#8 0x00007f458bcf84d0 in ?? ()
#9 0x0000559fd1519c80 in ?? ()
#10 0x00007f459a1f397f in ?? ()
#11 0x3ff0000000000000 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 5 (LWP 18061):
#0 0x00007f4599be3fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000032 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000559fd169f4c8 in ?? ()
#5 0x00007f458dcfc430 in ?? ()
#6 0x0000000000000064 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 4 (LWP 18060):
#0 0x00007f4599be3fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000002 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000559fd14fe848 in ?? ()
#5 0x00007f458e4fd790 in ?? ()
#6 0x0000000000000004 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 3 (LWP 18059):
#0 0x00007f4599be3fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000002 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000559fd14fe2a8 in ?? ()
#5 0x00007f458ecfe790 in ?? ()
#6 0x0000000000000004 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 2 (LWP 18058):
#0 0x00007f4599be3fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000002 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000559fd14fe188 in ?? ()
#5 0x00007f458f4ff790 in ?? ()
#6 0x0000000000000004 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 1 (LWP 18057):
#0 0x00007f4599be7d50 in ?? ()
#1 0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20260502 14:07:09.344470 12921 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskW6FTeC/build/release/bin/kudu with pid 18322
I20260502 14:07:09.356006 12921 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskW6FTeC/build/release/bin/kudu with pid 18189
I20260502 14:07:09.368359 12921 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskW6FTeC/build/release/bin/kudu with pid 18456
I20260502 14:07:09.379400 12921 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskW6FTeC/build/release/bin/kudu with pid 18057
I20260502 14:07:09.384729 12921 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskW6FTeC/build/release/bin/kudu with pid 14042
2026-05-02T14:07:09Z chronyd exiting
I20260502 14:07:09.400882 12921 test_util.cc:182] -----------------------------------------------
I20260502 14:07:09.400964 12921 test_util.cc:183] Had failures, leaving test files at /tmp/dist-test-taskW6FTeC/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777730766202997-12921-0
[ FAILED ] RollingRestartArgs/RollingRestartITest.TestWorkloads/4, where GetParam() = 800-byte object <01-00 00-00 01-00 00-00 04-00 00-00 00-00 00-00 20-C0 B9-13 56-55 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 40-C0 B9-13 56-55 00-00 00-00 00-00 00-00 00-00 ... 00-00 00-00 00-00 00-00 F8-C2 B9-13 56-55 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-01 00-00 00-00 00-00 04-00 00-00 03-00 00-00 01-00 00-00 00-00 00-00> (47760 ms)
[----------] 1 test from RollingRestartArgs/RollingRestartITest (47760 ms total)
[----------] Global test environment tear-down
[==========] 2 tests from 2 test suites ran. (63193 ms total)
[ PASSED ] 1 test.
[ FAILED ] 1 test, listed below:
[ FAILED ] RollingRestartArgs/RollingRestartITest.TestWorkloads/4, where GetParam() = 800-byte object <01-00 00-00 01-00 00-00 04-00 00-00 00-00 00-00 20-C0 B9-13 56-55 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 40-C0 B9-13 56-55 00-00 00-00 00-00 00-00 00-00 ... 00-00 00-00 00-00 00-00 F8-C2 B9-13 56-55 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-01 00-00 00-00 00-00 04-00 00-00 03-00 00-00 01-00 00-00 00-00 00-00>
1 FAILED TEST
I20260502 14:07:09.401548 12921 logging.cc:424] LogThrottler /home/jenkins-slave/workspace/build_and_test_flaky/src/kudu/client/meta_cache.cc:302: suppressed but not reported on 1 messages since previous log ~10 seconds ago