Note: This is test shard 1 of 8.
[==========] Running 2 tests from 2 test suites.
[----------] Global test environment set-up.
[----------] 1 test from MaintenanceModeRF3ITest
[ RUN ] MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate
2026-04-30T02:01:53Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2026-04-30T02:01:53Z Disabled control of system clock
WARNING: Logging before InitGoogleLogging() is written to STDERR
I20260430 02:01:53.858007 25189 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
/tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.24.153.126:40723
--webserver_interface=127.24.153.126
--webserver_port=0
--builtin_ntp_servers=127.24.153.84:39365
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.24.153.126:40723 with env {}
W20260430 02:01:53.966969 25197 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260430 02:01:53.967227 25197 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260430 02:01:53.967268 25197 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260430 02:01:53.970959 25197 flags.cc:432] Enabled experimental flag: --ipki_ca_key_size=768
W20260430 02:01:53.971060 25197 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260430 02:01:53.971083 25197 flags.cc:432] Enabled experimental flag: --tsk_num_rsa_bits=512
W20260430 02:01:53.971102 25197 flags.cc:432] Enabled experimental flag: --rpc_reuseport=true
I20260430 02:01:53.975597 25197 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.24.153.84:39365
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.24.153.126:40723
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.24.153.126:40723
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.24.153.126
--webserver_port=0
--never_fsync=true
--heap_profile_path=/tmp/kudu.25197
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true
Master server version:
kudu 1.19.0-SNAPSHOT
revision 8537ed5bb5e0e6ac1e481ff961e6b9d08cce7671
build type DEBUG
built by None at 30 Apr 2026 01:43:13 UTC on bdcb31816ec0
build id 11668
I20260430 02:01:53.976766 25197 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260430 02:01:53.977939 25197 file_cache.cc:492] Constructed file cache file cache with capacity 419430
I20260430 02:01:53.985113 25197 server_base.cc:1061] running on GCE node
W20260430 02:01:53.985342 25205 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:01:53.985255 25203 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:01:53.985255 25202 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260430 02:01:53.986124 25197 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260430 02:01:53.987306 25197 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260430 02:01:53.988515 25197 hybrid_clock.cc:648] HybridClock initialized: now 1777514513988490 us; error 47 us; skew 500 ppm
I20260430 02:01:53.990708 25197 webserver.cc:492] Webserver started at http://127.24.153.126:38027/ using document root <none> and password file <none>
I20260430 02:01:53.991372 25197 fs_manager.cc:362] Metadata directory not provided
I20260430 02:01:53.991463 25197 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260430 02:01:53.991709 25197 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260430 02:01:53.993427 25197 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/master-0/data/instance:
uuid: "10e131cec14c4db9a800c3f181cedd66"
format_stamp: "Formatted at 2026-04-30 02:01:53 on dist-test-slave-f7mg"
I20260430 02:01:53.994058 25197 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/master-0/wal/instance:
uuid: "10e131cec14c4db9a800c3f181cedd66"
format_stamp: "Formatted at 2026-04-30 02:01:53 on dist-test-slave-f7mg"
I20260430 02:01:53.997781 25197 fs_manager.cc:696] Time spent creating directory manager: real 0.003s user 0.002s sys 0.001s
I20260430 02:01:54.000731 25211 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260430 02:01:54.002090 25197 fs_manager.cc:730] Time spent opening block manager: real 0.003s user 0.003s sys 0.000s
I20260430 02:01:54.002264 25197 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/master-0/data,/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/master-0/wal
uuid: "10e131cec14c4db9a800c3f181cedd66"
format_stamp: "Formatted at 2026-04-30 02:01:53 on dist-test-slave-f7mg"
I20260430 02:01:54.002386 25197 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260430 02:01:54.015964 25197 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260430 02:01:54.016728 25197 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260430 02:01:54.017069 25197 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260430 02:01:54.025660 25263 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.24.153.126:40723 every 8 connection(s)
I20260430 02:01:54.025660 25197 rpc_server.cc:307] RPC server started. Bound to: 127.24.153.126:40723
I20260430 02:01:54.026949 25197 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/master-0/data/info.pb
I20260430 02:01:54.030409 25264 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20260430 02:01:54.034746 25189 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu as pid 25197
I20260430 02:01:54.034947 25189 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/master-0/wal/instance
I20260430 02:01:54.036403 25264 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66: Bootstrap starting.
I20260430 02:01:54.039089 25264 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66: Neither blocks nor log segments found. Creating new log.
I20260430 02:01:54.040022 25264 log.cc:826] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66: Log is configured to *not* fsync() on all Append() calls
I20260430 02:01:54.042132 25264 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66: No bootstrap required, opened a new log
I20260430 02:01:54.045143 25264 raft_consensus.cc:359] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "10e131cec14c4db9a800c3f181cedd66" member_type: VOTER last_known_addr { host: "127.24.153.126" port: 40723 } }
I20260430 02:01:54.045382 25264 raft_consensus.cc:385] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260430 02:01:54.045449 25264 raft_consensus.cc:740] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 10e131cec14c4db9a800c3f181cedd66, State: Initialized, Role: FOLLOWER
I20260430 02:01:54.045979 25264 consensus_queue.cc:260] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "10e131cec14c4db9a800c3f181cedd66" member_type: VOTER last_known_addr { host: "127.24.153.126" port: 40723 } }
I20260430 02:01:54.046151 25264 raft_consensus.cc:399] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66 [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20260430 02:01:54.046221 25264 raft_consensus.cc:493] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66 [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20260430 02:01:54.046330 25264 raft_consensus.cc:3060] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66 [term 0 FOLLOWER]: Advancing to term 1
I20260430 02:01:54.047355 25264 raft_consensus.cc:515] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "10e131cec14c4db9a800c3f181cedd66" member_type: VOTER last_known_addr { host: "127.24.153.126" port: 40723 } }
I20260430 02:01:54.047686 25264 leader_election.cc:304] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 10e131cec14c4db9a800c3f181cedd66; no voters:
I20260430 02:01:54.047961 25264 leader_election.cc:290] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66 [CANDIDATE]: Term 1 election: Requested vote from peers
I20260430 02:01:54.048132 25269 raft_consensus.cc:2804] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66 [term 1 FOLLOWER]: Leader election won for term 1
I20260430 02:01:54.048372 25269 raft_consensus.cc:697] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66 [term 1 LEADER]: Becoming Leader. State: Replica: 10e131cec14c4db9a800c3f181cedd66, State: Running, Role: LEADER
I20260430 02:01:54.048753 25269 consensus_queue.cc:237] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "10e131cec14c4db9a800c3f181cedd66" member_type: VOTER last_known_addr { host: "127.24.153.126" port: 40723 } }
I20260430 02:01:54.049321 25264 sys_catalog.cc:565] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66 [sys.catalog]: configured and running, proceeding with master startup.
I20260430 02:01:54.050509 25270 sys_catalog.cc:455] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "10e131cec14c4db9a800c3f181cedd66" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "10e131cec14c4db9a800c3f181cedd66" member_type: VOTER last_known_addr { host: "127.24.153.126" port: 40723 } } }
I20260430 02:01:54.050707 25270 sys_catalog.cc:458] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66 [sys.catalog]: This master's current role is: LEADER
I20260430 02:01:54.051745 25278 catalog_manager.cc:1485] Loading table and tablet metadata into memory...
I20260430 02:01:54.050510 25271 sys_catalog.cc:455] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66 [sys.catalog]: SysCatalogTable state changed. Reason: New leader 10e131cec14c4db9a800c3f181cedd66. Latest consensus state: current_term: 1 leader_uuid: "10e131cec14c4db9a800c3f181cedd66" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "10e131cec14c4db9a800c3f181cedd66" member_type: VOTER last_known_addr { host: "127.24.153.126" port: 40723 } } }
I20260430 02:01:54.052451 25271 sys_catalog.cc:458] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66 [sys.catalog]: This master's current role is: LEADER
I20260430 02:01:54.055647 25278 catalog_manager.cc:1494] Initializing Kudu cluster ID...
I20260430 02:01:54.061316 25278 catalog_manager.cc:1357] Generated new cluster ID: 7aa1ab3da6514d7f8f19e5fdffeb4a89
I20260430 02:01:54.061412 25278 catalog_manager.cc:1505] Initializing Kudu internal certificate authority...
I20260430 02:01:54.088374 25278 catalog_manager.cc:1380] Generated new certificate authority record
I20260430 02:01:54.089358 25278 catalog_manager.cc:1514] Loading token signing keys...
I20260430 02:01:54.101946 25278 catalog_manager.cc:6044] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66: Generated new TSK 0
I20260430 02:01:54.102906 25278 catalog_manager.cc:1524] Initializing in-progress tserver states...
I20260430 02:01:54.107098 25189 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
/tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.24.153.65:0
--local_ip_for_outbound_sockets=127.24.153.65
--webserver_interface=127.24.153.65
--webserver_port=0
--tserver_master_addrs=127.24.153.126:40723
--builtin_ntp_servers=127.24.153.84:39365
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--follower_unavailable_considered_failed_sec=2
--enable_log_gc=false with env {}
W20260430 02:01:54.223106 25288 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260430 02:01:54.223379 25288 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260430 02:01:54.223443 25288 flags.cc:432] Enabled unsafe flag: --enable_log_gc=false
W20260430 02:01:54.223510 25288 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260430 02:01:54.227195 25288 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260430 02:01:54.227391 25288 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.24.153.65
I20260430 02:01:54.231635 25288 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.24.153.84:39365
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--follower_unavailable_considered_failed_sec=2
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.24.153.65:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.24.153.65
--webserver_port=0
--enable_log_gc=false
--tserver_master_addrs=127.24.153.126:40723
--never_fsync=true
--heap_profile_path=/tmp/kudu.25288
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.24.153.65
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8537ed5bb5e0e6ac1e481ff961e6b9d08cce7671
build type DEBUG
built by None at 30 Apr 2026 01:43:13 UTC on bdcb31816ec0
build id 11668
I20260430 02:01:54.232941 25288 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260430 02:01:54.234266 25288 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260430 02:01:54.237720 25288 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260430 02:01:54.242538 25296 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260430 02:01:54.242767 25288 server_base.cc:1061] running on GCE node
W20260430 02:01:54.242516 25293 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:01:54.242511 25294 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260430 02:01:54.243299 25288 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260430 02:01:54.243994 25288 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260430 02:01:54.245222 25288 hybrid_clock.cc:648] HybridClock initialized: now 1777514514245178 us; error 58 us; skew 500 ppm
I20260430 02:01:54.247470 25288 webserver.cc:492] Webserver started at http://127.24.153.65:38833/ using document root <none> and password file <none>
I20260430 02:01:54.248371 25288 fs_manager.cc:362] Metadata directory not provided
I20260430 02:01:54.248437 25288 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260430 02:01:54.248664 25288 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260430 02:01:54.250423 25288 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-0/data/instance:
uuid: "e481cdb11e4040f2b78a224a2bed5eaa"
format_stamp: "Formatted at 2026-04-30 02:01:54 on dist-test-slave-f7mg"
I20260430 02:01:54.251055 25288 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-0/wal/instance:
uuid: "e481cdb11e4040f2b78a224a2bed5eaa"
format_stamp: "Formatted at 2026-04-30 02:01:54 on dist-test-slave-f7mg"
I20260430 02:01:54.254941 25288 fs_manager.cc:696] Time spent creating directory manager: real 0.004s user 0.003s sys 0.001s
I20260430 02:01:54.257416 25302 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260430 02:01:54.258620 25288 fs_manager.cc:730] Time spent opening block manager: real 0.002s user 0.002s sys 0.000s
I20260430 02:01:54.258778 25288 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-0/data,/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-0/wal
uuid: "e481cdb11e4040f2b78a224a2bed5eaa"
format_stamp: "Formatted at 2026-04-30 02:01:54 on dist-test-slave-f7mg"
I20260430 02:01:54.258929 25288 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260430 02:01:54.277287 25288 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260430 02:01:54.278084 25288 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260430 02:01:54.278298 25288 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260430 02:01:54.278918 25288 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260430 02:01:54.280038 25288 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260430 02:01:54.280117 25288 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20260430 02:01:54.280195 25288 ts_tablet_manager.cc:616] Registered 0 tablets
I20260430 02:01:54.280243 25288 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20260430 02:01:54.292117 25288 rpc_server.cc:307] RPC server started. Bound to: 127.24.153.65:44971
I20260430 02:01:54.292145 25415 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.24.153.65:44971 every 8 connection(s)
I20260430 02:01:54.293237 25288 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-0/data/info.pb
I20260430 02:01:54.294126 25189 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu as pid 25288
I20260430 02:01:54.294209 25189 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-0/wal/instance
I20260430 02:01:54.296828 25189 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
/tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.24.153.66:0
--local_ip_for_outbound_sockets=127.24.153.66
--webserver_interface=127.24.153.66
--webserver_port=0
--tserver_master_addrs=127.24.153.126:40723
--builtin_ntp_servers=127.24.153.84:39365
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--follower_unavailable_considered_failed_sec=2
--enable_log_gc=false with env {}
I20260430 02:01:54.305085 25416 heartbeater.cc:344] Connected to a master server at 127.24.153.126:40723
I20260430 02:01:54.305394 25416 heartbeater.cc:461] Registering TS with master...
I20260430 02:01:54.306102 25416 heartbeater.cc:507] Master 127.24.153.126:40723 requested a full tablet report, sending...
I20260430 02:01:54.308151 25228 ts_manager.cc:194] Registered new tserver with Master: e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971)
I20260430 02:01:54.309424 25228 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.24.153.65:34749
W20260430 02:01:54.422327 25419 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260430 02:01:54.422684 25419 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260430 02:01:54.422761 25419 flags.cc:432] Enabled unsafe flag: --enable_log_gc=false
W20260430 02:01:54.422827 25419 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260430 02:01:54.427029 25419 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260430 02:01:54.427158 25419 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.24.153.66
I20260430 02:01:54.431746 25419 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.24.153.84:39365
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--follower_unavailable_considered_failed_sec=2
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.24.153.66:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.24.153.66
--webserver_port=0
--enable_log_gc=false
--tserver_master_addrs=127.24.153.126:40723
--never_fsync=true
--heap_profile_path=/tmp/kudu.25419
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.24.153.66
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8537ed5bb5e0e6ac1e481ff961e6b9d08cce7671
build type DEBUG
built by None at 30 Apr 2026 01:43:13 UTC on bdcb31816ec0
build id 11668
I20260430 02:01:54.433046 25419 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260430 02:01:54.434333 25419 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260430 02:01:54.437155 25419 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260430 02:01:54.441515 25427 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:01:54.441524 25424 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260430 02:01:54.441556 25419 server_base.cc:1061] running on GCE node
W20260430 02:01:54.441565 25425 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260430 02:01:54.442210 25419 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260430 02:01:54.442816 25419 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260430 02:01:54.443986 25419 hybrid_clock.cc:648] HybridClock initialized: now 1777514514443962 us; error 45 us; skew 500 ppm
I20260430 02:01:54.446175 25419 webserver.cc:492] Webserver started at http://127.24.153.66:46521/ using document root <none> and password file <none>
I20260430 02:01:54.446739 25419 fs_manager.cc:362] Metadata directory not provided
I20260430 02:01:54.446792 25419 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260430 02:01:54.446998 25419 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260430 02:01:54.448684 25419 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-1/data/instance:
uuid: "f5202ea2c8244e849a11073ee5d918c5"
format_stamp: "Formatted at 2026-04-30 02:01:54 on dist-test-slave-f7mg"
I20260430 02:01:54.449182 25419 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-1/wal/instance:
uuid: "f5202ea2c8244e849a11073ee5d918c5"
format_stamp: "Formatted at 2026-04-30 02:01:54 on dist-test-slave-f7mg"
I20260430 02:01:54.453058 25419 fs_manager.cc:696] Time spent creating directory manager: real 0.004s user 0.003s sys 0.003s
I20260430 02:01:54.455451 25433 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260430 02:01:54.456516 25419 fs_manager.cc:730] Time spent opening block manager: real 0.002s user 0.001s sys 0.000s
I20260430 02:01:54.456622 25419 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-1/data,/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-1/wal
uuid: "f5202ea2c8244e849a11073ee5d918c5"
format_stamp: "Formatted at 2026-04-30 02:01:54 on dist-test-slave-f7mg"
I20260430 02:01:54.456776 25419 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260430 02:01:54.469183 25419 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260430 02:01:54.470017 25419 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260430 02:01:54.470227 25419 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260430 02:01:54.470891 25419 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260430 02:01:54.472005 25419 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260430 02:01:54.472061 25419 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20260430 02:01:54.472141 25419 ts_tablet_manager.cc:616] Registered 0 tablets
I20260430 02:01:54.472184 25419 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20260430 02:01:54.482973 25419 rpc_server.cc:307] RPC server started. Bound to: 127.24.153.66:43745
I20260430 02:01:54.483002 25546 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.24.153.66:43745 every 8 connection(s)
I20260430 02:01:54.484201 25419 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-1/data/info.pb
I20260430 02:01:54.493821 25189 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu as pid 25419
I20260430 02:01:54.494167 25189 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-1/wal/instance
I20260430 02:01:54.497651 25189 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
/tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.24.153.67:0
--local_ip_for_outbound_sockets=127.24.153.67
--webserver_interface=127.24.153.67
--webserver_port=0
--tserver_master_addrs=127.24.153.126:40723
--builtin_ntp_servers=127.24.153.84:39365
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--follower_unavailable_considered_failed_sec=2
--enable_log_gc=false with env {}
I20260430 02:01:54.499115 25547 heartbeater.cc:344] Connected to a master server at 127.24.153.126:40723
I20260430 02:01:54.499410 25547 heartbeater.cc:461] Registering TS with master...
I20260430 02:01:54.500362 25547 heartbeater.cc:507] Master 127.24.153.126:40723 requested a full tablet report, sending...
I20260430 02:01:54.501832 25228 ts_manager.cc:194] Registered new tserver with Master: f5202ea2c8244e849a11073ee5d918c5 (127.24.153.66:43745)
I20260430 02:01:54.502715 25228 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.24.153.66:49337
W20260430 02:01:54.624984 25550 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260430 02:01:54.625218 25550 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260430 02:01:54.625253 25550 flags.cc:432] Enabled unsafe flag: --enable_log_gc=false
W20260430 02:01:54.625315 25550 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260430 02:01:54.628818 25550 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260430 02:01:54.628949 25550 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.24.153.67
I20260430 02:01:54.632989 25550 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.24.153.84:39365
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--follower_unavailable_considered_failed_sec=2
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.24.153.67:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.24.153.67
--webserver_port=0
--enable_log_gc=false
--tserver_master_addrs=127.24.153.126:40723
--never_fsync=true
--heap_profile_path=/tmp/kudu.25550
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.24.153.67
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8537ed5bb5e0e6ac1e481ff961e6b9d08cce7671
build type DEBUG
built by None at 30 Apr 2026 01:43:13 UTC on bdcb31816ec0
build id 11668
I20260430 02:01:54.634434 25550 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260430 02:01:54.635622 25550 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260430 02:01:54.638708 25550 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260430 02:01:54.643487 25558 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:01:54.643509 25556 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:01:54.643488 25555 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260430 02:01:54.643641 25550 server_base.cc:1061] running on GCE node
I20260430 02:01:54.644160 25550 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260430 02:01:54.644819 25550 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260430 02:01:54.646034 25550 hybrid_clock.cc:648] HybridClock initialized: now 1777514514646005 us; error 43 us; skew 500 ppm
I20260430 02:01:54.648444 25550 webserver.cc:492] Webserver started at http://127.24.153.67:46071/ using document root <none> and password file <none>
I20260430 02:01:54.649077 25550 fs_manager.cc:362] Metadata directory not provided
I20260430 02:01:54.649138 25550 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260430 02:01:54.649353 25550 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260430 02:01:54.651118 25550 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-2/data/instance:
uuid: "a4ec6dffeb11435b8655672771cd29c4"
format_stamp: "Formatted at 2026-04-30 02:01:54 on dist-test-slave-f7mg"
I20260430 02:01:54.651645 25550 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-2/wal/instance:
uuid: "a4ec6dffeb11435b8655672771cd29c4"
format_stamp: "Formatted at 2026-04-30 02:01:54 on dist-test-slave-f7mg"
I20260430 02:01:54.655381 25550 fs_manager.cc:696] Time spent creating directory manager: real 0.003s user 0.004s sys 0.000s
I20260430 02:01:54.658236 25564 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260430 02:01:54.659425 25550 fs_manager.cc:730] Time spent opening block manager: real 0.003s user 0.003s sys 0.000s
I20260430 02:01:54.659592 25550 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-2/data,/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-2/wal
uuid: "a4ec6dffeb11435b8655672771cd29c4"
format_stamp: "Formatted at 2026-04-30 02:01:54 on dist-test-slave-f7mg"
I20260430 02:01:54.659715 25550 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260430 02:01:54.688541 25550 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260430 02:01:54.689317 25550 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260430 02:01:54.689543 25550 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260430 02:01:54.690194 25550 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260430 02:01:54.691280 25550 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260430 02:01:54.691357 25550 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20260430 02:01:54.691431 25550 ts_tablet_manager.cc:616] Registered 0 tablets
I20260430 02:01:54.691485 25550 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20260430 02:01:54.701113 25550 rpc_server.cc:307] RPC server started. Bound to: 127.24.153.67:34223
I20260430 02:01:54.701175 25677 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.24.153.67:34223 every 8 connection(s)
I20260430 02:01:54.702242 25550 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-2/data/info.pb
I20260430 02:01:54.705684 25189 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu as pid 25550
I20260430 02:01:54.705798 25189 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-2/wal/instance
I20260430 02:01:54.713083 25678 heartbeater.cc:344] Connected to a master server at 127.24.153.126:40723
I20260430 02:01:54.713338 25678 heartbeater.cc:461] Registering TS with master...
I20260430 02:01:54.713994 25678 heartbeater.cc:507] Master 127.24.153.126:40723 requested a full tablet report, sending...
I20260430 02:01:54.715135 25228 ts_manager.cc:194] Registered new tserver with Master: a4ec6dffeb11435b8655672771cd29c4 (127.24.153.67:34223)
I20260430 02:01:54.715812 25228 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.24.153.67:56391
I20260430 02:01:54.721020 25189 external_mini_cluster.cc:949] 3 TS(s) registered with all masters
I20260430 02:01:54.748454 25228 catalog_manager.cc:2257] Servicing CreateTable request from {username='slave'} at 127.0.0.1:44778:
name: "test-workload"
schema {
columns {
name: "key"
type: INT32
is_key: true
is_nullable: false
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
columns {
name: "int_val"
type: INT32
is_key: false
is_nullable: false
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
columns {
name: "string_val"
type: STRING
is_key: false
is_nullable: true
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
}
num_replicas: 3
split_rows_range_bounds {
rows: "<redacted>""\004\001\000UUU\025\004\001\000\252\252\252*\004\001\000\377\377\377?\004\001\000TUUU\004\001\000\251\252\252j"
indirect_data: "<redacted>"""
}
partition_schema {
range_schema {
columns {
name: "key"
}
}
}
W20260430 02:01:54.750207 25228 catalog_manager.cc:7033] The number of live tablet servers is not enough to re-replicate a tablet replica of the newly created table test-workload in case of a server failure: 4 tablet servers would be needed, 3 are available. Consider bringing up more tablet servers.
I20260430 02:01:54.773945 25349 tablet_service.cc:1511] Processing CreateTablet for tablet 29e24069ebd04300a2c4cf4d4bdc5e66 (DEFAULT_TABLE table=test-workload [id=86672283a7964500ab7eab1ca56ae85d]), partition=RANGE (key) PARTITION VALUES < 357913941
I20260430 02:01:54.773404 25350 tablet_service.cc:1511] Processing CreateTablet for tablet 2c080a5b649f453c903e1dce5ab6a113 (DEFAULT_TABLE table=test-workload [id=86672283a7964500ab7eab1ca56ae85d]), partition=RANGE (key) PARTITION 357913941 <= VALUES < 715827882
I20260430 02:01:54.774412 25347 tablet_service.cc:1511] Processing CreateTablet for tablet f1dbbbee06674a93a3dd31c45c90d59d (DEFAULT_TABLE table=test-workload [id=86672283a7964500ab7eab1ca56ae85d]), partition=RANGE (key) PARTITION 1073741823 <= VALUES < 1431655764
I20260430 02:01:54.773553 25345 tablet_service.cc:1511] Processing CreateTablet for tablet 63c4457448ec4b1b8a0741a7560cddfe (DEFAULT_TABLE table=test-workload [id=86672283a7964500ab7eab1ca56ae85d]), partition=RANGE (key) PARTITION 1789569705 <= VALUES
I20260430 02:01:54.774183 25348 tablet_service.cc:1511] Processing CreateTablet for tablet fbfe4b6b54594e36a76de6e54d2adb8c (DEFAULT_TABLE table=test-workload [id=86672283a7964500ab7eab1ca56ae85d]), partition=RANGE (key) PARTITION 715827882 <= VALUES < 1073741823
I20260430 02:01:54.775067 25345 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 63c4457448ec4b1b8a0741a7560cddfe. 1 dirs total, 0 dirs full, 0 dirs failed
I20260430 02:01:54.773404 25346 tablet_service.cc:1511] Processing CreateTablet for tablet 869a29991dbe4537bf1082d1c9ee2ecd (DEFAULT_TABLE table=test-workload [id=86672283a7964500ab7eab1ca56ae85d]), partition=RANGE (key) PARTITION 1431655764 <= VALUES < 1789569705
I20260430 02:01:54.774204 25611 tablet_service.cc:1511] Processing CreateTablet for tablet 2c080a5b649f453c903e1dce5ab6a113 (DEFAULT_TABLE table=test-workload [id=86672283a7964500ab7eab1ca56ae85d]), partition=RANGE (key) PARTITION 357913941 <= VALUES < 715827882
I20260430 02:01:54.775432 25348 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet fbfe4b6b54594e36a76de6e54d2adb8c. 1 dirs total, 0 dirs full, 0 dirs failed
I20260430 02:01:54.775621 25347 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet f1dbbbee06674a93a3dd31c45c90d59d. 1 dirs total, 0 dirs full, 0 dirs failed
I20260430 02:01:54.775774 25611 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 2c080a5b649f453c903e1dce5ab6a113. 1 dirs total, 0 dirs full, 0 dirs failed
I20260430 02:01:54.776820 25349 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 29e24069ebd04300a2c4cf4d4bdc5e66. 1 dirs total, 0 dirs full, 0 dirs failed
I20260430 02:01:54.777567 25350 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 2c080a5b649f453c903e1dce5ab6a113. 1 dirs total, 0 dirs full, 0 dirs failed
I20260430 02:01:54.774442 25607 tablet_service.cc:1511] Processing CreateTablet for tablet 63c4457448ec4b1b8a0741a7560cddfe (DEFAULT_TABLE table=test-workload [id=86672283a7964500ab7eab1ca56ae85d]), partition=RANGE (key) PARTITION 1789569705 <= VALUES
I20260430 02:01:54.778203 25607 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 63c4457448ec4b1b8a0741a7560cddfe. 1 dirs total, 0 dirs full, 0 dirs failed
I20260430 02:01:54.781113 25346 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 869a29991dbe4537bf1082d1c9ee2ecd. 1 dirs total, 0 dirs full, 0 dirs failed
I20260430 02:01:54.774327 25608 tablet_service.cc:1511] Processing CreateTablet for tablet 869a29991dbe4537bf1082d1c9ee2ecd (DEFAULT_TABLE table=test-workload [id=86672283a7964500ab7eab1ca56ae85d]), partition=RANGE (key) PARTITION 1431655764 <= VALUES < 1789569705
I20260430 02:01:54.782212 25608 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 869a29991dbe4537bf1082d1c9ee2ecd. 1 dirs total, 0 dirs full, 0 dirs failed
I20260430 02:01:54.785185 25701 tablet_bootstrap.cc:492] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4: Bootstrap starting.
I20260430 02:01:54.774068 25610 tablet_service.cc:1511] Processing CreateTablet for tablet fbfe4b6b54594e36a76de6e54d2adb8c (DEFAULT_TABLE table=test-workload [id=86672283a7964500ab7eab1ca56ae85d]), partition=RANGE (key) PARTITION 715827882 <= VALUES < 1073741823
I20260430 02:01:54.785795 25610 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet fbfe4b6b54594e36a76de6e54d2adb8c. 1 dirs total, 0 dirs full, 0 dirs failed
I20260430 02:01:54.785956 25700 tablet_bootstrap.cc:492] T 63c4457448ec4b1b8a0741a7560cddfe P e481cdb11e4040f2b78a224a2bed5eaa: Bootstrap starting.
I20260430 02:01:54.774559 25612 tablet_service.cc:1511] Processing CreateTablet for tablet 29e24069ebd04300a2c4cf4d4bdc5e66 (DEFAULT_TABLE table=test-workload [id=86672283a7964500ab7eab1ca56ae85d]), partition=RANGE (key) PARTITION VALUES < 357913941
I20260430 02:01:54.786221 25612 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 29e24069ebd04300a2c4cf4d4bdc5e66. 1 dirs total, 0 dirs full, 0 dirs failed
I20260430 02:01:54.788008 25701 tablet_bootstrap.cc:654] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4: Neither blocks nor log segments found. Creating new log.
I20260430 02:01:54.788486 25700 tablet_bootstrap.cc:654] T 63c4457448ec4b1b8a0741a7560cddfe P e481cdb11e4040f2b78a224a2bed5eaa: Neither blocks nor log segments found. Creating new log.
I20260430 02:01:54.788822 25701 log.cc:826] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4: Log is configured to *not* fsync() on all Append() calls
I20260430 02:01:54.789381 25700 log.cc:826] T 63c4457448ec4b1b8a0741a7560cddfe P e481cdb11e4040f2b78a224a2bed5eaa: Log is configured to *not* fsync() on all Append() calls
I20260430 02:01:54.773397 25609 tablet_service.cc:1511] Processing CreateTablet for tablet f1dbbbee06674a93a3dd31c45c90d59d (DEFAULT_TABLE table=test-workload [id=86672283a7964500ab7eab1ca56ae85d]), partition=RANGE (key) PARTITION 1073741823 <= VALUES < 1431655764
I20260430 02:01:54.790124 25609 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet f1dbbbee06674a93a3dd31c45c90d59d. 1 dirs total, 0 dirs full, 0 dirs failed
I20260430 02:01:54.790880 25701 tablet_bootstrap.cc:492] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4: No bootstrap required, opened a new log
I20260430 02:01:54.791131 25701 ts_tablet_manager.cc:1403] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4: Time spent bootstrapping tablet: real 0.006s user 0.006s sys 0.000s
I20260430 02:01:54.791245 25700 tablet_bootstrap.cc:492] T 63c4457448ec4b1b8a0741a7560cddfe P e481cdb11e4040f2b78a224a2bed5eaa: No bootstrap required, opened a new log
I20260430 02:01:54.791460 25700 ts_tablet_manager.cc:1403] T 63c4457448ec4b1b8a0741a7560cddfe P e481cdb11e4040f2b78a224a2bed5eaa: Time spent bootstrapping tablet: real 0.006s user 0.005s sys 0.000s
I20260430 02:01:54.795239 25700 raft_consensus.cc:359] T 63c4457448ec4b1b8a0741a7560cddfe P e481cdb11e4040f2b78a224a2bed5eaa [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.795239 25701 raft_consensus.cc:359] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.795490 25701 raft_consensus.cc:385] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260430 02:01:54.795590 25701 raft_consensus.cc:740] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: a4ec6dffeb11435b8655672771cd29c4, State: Initialized, Role: FOLLOWER
I20260430 02:01:54.795714 25700 raft_consensus.cc:385] T 63c4457448ec4b1b8a0741a7560cddfe P e481cdb11e4040f2b78a224a2bed5eaa [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260430 02:01:54.795841 25700 raft_consensus.cc:740] T 63c4457448ec4b1b8a0741a7560cddfe P e481cdb11e4040f2b78a224a2bed5eaa [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: e481cdb11e4040f2b78a224a2bed5eaa, State: Initialized, Role: FOLLOWER
I20260430 02:01:54.796126 25701 consensus_queue.cc:260] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.796437 25700 consensus_queue.cc:260] T 63c4457448ec4b1b8a0741a7560cddfe P e481cdb11e4040f2b78a224a2bed5eaa [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.796192 25481 tablet_service.cc:1511] Processing CreateTablet for tablet 29e24069ebd04300a2c4cf4d4bdc5e66 (DEFAULT_TABLE table=test-workload [id=86672283a7964500ab7eab1ca56ae85d]), partition=RANGE (key) PARTITION VALUES < 357913941
I20260430 02:01:54.796798 25479 tablet_service.cc:1511] Processing CreateTablet for tablet fbfe4b6b54594e36a76de6e54d2adb8c (DEFAULT_TABLE table=test-workload [id=86672283a7964500ab7eab1ca56ae85d]), partition=RANGE (key) PARTITION 715827882 <= VALUES < 1073741823
I20260430 02:01:54.797109 25701 ts_tablet_manager.cc:1434] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4: Time spent starting tablet: real 0.006s user 0.005s sys 0.000s
I20260430 02:01:54.797312 25480 tablet_service.cc:1511] Processing CreateTablet for tablet 2c080a5b649f453c903e1dce5ab6a113 (DEFAULT_TABLE table=test-workload [id=86672283a7964500ab7eab1ca56ae85d]), partition=RANGE (key) PARTITION 357913941 <= VALUES < 715827882
I20260430 02:01:54.797477 25678 heartbeater.cc:499] Master 127.24.153.126:40723 was elected leader, sending a full tablet report...
I20260430 02:01:54.797577 25700 ts_tablet_manager.cc:1434] T 63c4457448ec4b1b8a0741a7560cddfe P e481cdb11e4040f2b78a224a2bed5eaa: Time spent starting tablet: real 0.006s user 0.005s sys 0.000s
I20260430 02:01:54.796192 25478 tablet_service.cc:1511] Processing CreateTablet for tablet f1dbbbee06674a93a3dd31c45c90d59d (DEFAULT_TABLE table=test-workload [id=86672283a7964500ab7eab1ca56ae85d]), partition=RANGE (key) PARTITION 1073741823 <= VALUES < 1431655764
I20260430 02:01:54.797650 25479 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet fbfe4b6b54594e36a76de6e54d2adb8c. 1 dirs total, 0 dirs full, 0 dirs failed
I20260430 02:01:54.797933 25416 heartbeater.cc:499] Master 127.24.153.126:40723 was elected leader, sending a full tablet report...
I20260430 02:01:54.797989 25700 tablet_bootstrap.cc:492] T f1dbbbee06674a93a3dd31c45c90d59d P e481cdb11e4040f2b78a224a2bed5eaa: Bootstrap starting.
I20260430 02:01:54.796305 25477 tablet_service.cc:1511] Processing CreateTablet for tablet 869a29991dbe4537bf1082d1c9ee2ecd (DEFAULT_TABLE table=test-workload [id=86672283a7964500ab7eab1ca56ae85d]), partition=RANGE (key) PARTITION 1431655764 <= VALUES < 1789569705
I20260430 02:01:54.799130 25477 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 869a29991dbe4537bf1082d1c9ee2ecd. 1 dirs total, 0 dirs full, 0 dirs failed
I20260430 02:01:54.799293 25700 tablet_bootstrap.cc:654] T f1dbbbee06674a93a3dd31c45c90d59d P e481cdb11e4040f2b78a224a2bed5eaa: Neither blocks nor log segments found. Creating new log.
I20260430 02:01:54.800506 25480 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 2c080a5b649f453c903e1dce5ab6a113. 1 dirs total, 0 dirs full, 0 dirs failed
I20260430 02:01:54.800621 25700 tablet_bootstrap.cc:492] T f1dbbbee06674a93a3dd31c45c90d59d P e481cdb11e4040f2b78a224a2bed5eaa: No bootstrap required, opened a new log
I20260430 02:01:54.800755 25700 ts_tablet_manager.cc:1403] T f1dbbbee06674a93a3dd31c45c90d59d P e481cdb11e4040f2b78a224a2bed5eaa: Time spent bootstrapping tablet: real 0.003s user 0.002s sys 0.000s
I20260430 02:01:54.801479 25700 raft_consensus.cc:359] T f1dbbbee06674a93a3dd31c45c90d59d P e481cdb11e4040f2b78a224a2bed5eaa [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.801837 25478 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet f1dbbbee06674a93a3dd31c45c90d59d. 1 dirs total, 0 dirs full, 0 dirs failed
I20260430 02:01:54.796679 25476 tablet_service.cc:1511] Processing CreateTablet for tablet 63c4457448ec4b1b8a0741a7560cddfe (DEFAULT_TABLE table=test-workload [id=86672283a7964500ab7eab1ca56ae85d]), partition=RANGE (key) PARTITION 1789569705 <= VALUES
I20260430 02:01:54.801723 25700 raft_consensus.cc:385] T f1dbbbee06674a93a3dd31c45c90d59d P e481cdb11e4040f2b78a224a2bed5eaa [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260430 02:01:54.802078 25700 raft_consensus.cc:740] T f1dbbbee06674a93a3dd31c45c90d59d P e481cdb11e4040f2b78a224a2bed5eaa [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: e481cdb11e4040f2b78a224a2bed5eaa, State: Initialized, Role: FOLLOWER
I20260430 02:01:54.802220 25481 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 29e24069ebd04300a2c4cf4d4bdc5e66. 1 dirs total, 0 dirs full, 0 dirs failed
I20260430 02:01:54.802384 25700 consensus_queue.cc:260] T f1dbbbee06674a93a3dd31c45c90d59d P e481cdb11e4040f2b78a224a2bed5eaa [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.802872 25700 ts_tablet_manager.cc:1434] T f1dbbbee06674a93a3dd31c45c90d59d P e481cdb11e4040f2b78a224a2bed5eaa: Time spent starting tablet: real 0.002s user 0.001s sys 0.000s
I20260430 02:01:54.803093 25700 tablet_bootstrap.cc:492] T 2c080a5b649f453c903e1dce5ab6a113 P e481cdb11e4040f2b78a224a2bed5eaa: Bootstrap starting.
I20260430 02:01:54.804042 25476 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 63c4457448ec4b1b8a0741a7560cddfe. 1 dirs total, 0 dirs full, 0 dirs failed
I20260430 02:01:54.804388 25700 tablet_bootstrap.cc:654] T 2c080a5b649f453c903e1dce5ab6a113 P e481cdb11e4040f2b78a224a2bed5eaa: Neither blocks nor log segments found. Creating new log.
I20260430 02:01:54.806695 25700 tablet_bootstrap.cc:492] T 2c080a5b649f453c903e1dce5ab6a113 P e481cdb11e4040f2b78a224a2bed5eaa: No bootstrap required, opened a new log
I20260430 02:01:54.806856 25700 ts_tablet_manager.cc:1403] T 2c080a5b649f453c903e1dce5ab6a113 P e481cdb11e4040f2b78a224a2bed5eaa: Time spent bootstrapping tablet: real 0.004s user 0.002s sys 0.000s
I20260430 02:01:54.807528 25700 raft_consensus.cc:359] T 2c080a5b649f453c903e1dce5ab6a113 P e481cdb11e4040f2b78a224a2bed5eaa [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.807647 25700 raft_consensus.cc:385] T 2c080a5b649f453c903e1dce5ab6a113 P e481cdb11e4040f2b78a224a2bed5eaa [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260430 02:01:54.807682 25700 raft_consensus.cc:740] T 2c080a5b649f453c903e1dce5ab6a113 P e481cdb11e4040f2b78a224a2bed5eaa [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: e481cdb11e4040f2b78a224a2bed5eaa, State: Initialized, Role: FOLLOWER
I20260430 02:01:54.807922 25700 consensus_queue.cc:260] T 2c080a5b649f453c903e1dce5ab6a113 P e481cdb11e4040f2b78a224a2bed5eaa [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.808256 25700 ts_tablet_manager.cc:1434] T 2c080a5b649f453c903e1dce5ab6a113 P e481cdb11e4040f2b78a224a2bed5eaa: Time spent starting tablet: real 0.001s user 0.001s sys 0.000s
I20260430 02:01:54.808475 25700 tablet_bootstrap.cc:492] T 869a29991dbe4537bf1082d1c9ee2ecd P e481cdb11e4040f2b78a224a2bed5eaa: Bootstrap starting.
I20260430 02:01:54.809132 25709 tablet_bootstrap.cc:492] T 29e24069ebd04300a2c4cf4d4bdc5e66 P f5202ea2c8244e849a11073ee5d918c5: Bootstrap starting.
I20260430 02:01:54.809646 25700 tablet_bootstrap.cc:654] T 869a29991dbe4537bf1082d1c9ee2ecd P e481cdb11e4040f2b78a224a2bed5eaa: Neither blocks nor log segments found. Creating new log.
I20260430 02:01:54.811064 25700 tablet_bootstrap.cc:492] T 869a29991dbe4537bf1082d1c9ee2ecd P e481cdb11e4040f2b78a224a2bed5eaa: No bootstrap required, opened a new log
I20260430 02:01:54.811154 25700 ts_tablet_manager.cc:1403] T 869a29991dbe4537bf1082d1c9ee2ecd P e481cdb11e4040f2b78a224a2bed5eaa: Time spent bootstrapping tablet: real 0.003s user 0.002s sys 0.000s
I20260430 02:01:54.811672 25700 raft_consensus.cc:359] T 869a29991dbe4537bf1082d1c9ee2ecd P e481cdb11e4040f2b78a224a2bed5eaa [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.811792 25700 raft_consensus.cc:385] T 869a29991dbe4537bf1082d1c9ee2ecd P e481cdb11e4040f2b78a224a2bed5eaa [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260430 02:01:54.811831 25700 raft_consensus.cc:740] T 869a29991dbe4537bf1082d1c9ee2ecd P e481cdb11e4040f2b78a224a2bed5eaa [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: e481cdb11e4040f2b78a224a2bed5eaa, State: Initialized, Role: FOLLOWER
I20260430 02:01:54.811872 25709 tablet_bootstrap.cc:654] T 29e24069ebd04300a2c4cf4d4bdc5e66 P f5202ea2c8244e849a11073ee5d918c5: Neither blocks nor log segments found. Creating new log.
I20260430 02:01:54.811962 25700 consensus_queue.cc:260] T 869a29991dbe4537bf1082d1c9ee2ecd P e481cdb11e4040f2b78a224a2bed5eaa [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.812176 25701 tablet_bootstrap.cc:492] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4: Bootstrap starting.
I20260430 02:01:54.812242 25700 ts_tablet_manager.cc:1434] T 869a29991dbe4537bf1082d1c9ee2ecd P e481cdb11e4040f2b78a224a2bed5eaa: Time spent starting tablet: real 0.001s user 0.001s sys 0.000s
I20260430 02:01:54.812384 25705 raft_consensus.cc:493] T 2c080a5b649f453c903e1dce5ab6a113 P e481cdb11e4040f2b78a224a2bed5eaa [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260430 02:01:54.812505 25705 raft_consensus.cc:515] T 2c080a5b649f453c903e1dce5ab6a113 P e481cdb11e4040f2b78a224a2bed5eaa [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.812793 25709 log.cc:826] T 29e24069ebd04300a2c4cf4d4bdc5e66 P f5202ea2c8244e849a11073ee5d918c5: Log is configured to *not* fsync() on all Append() calls
I20260430 02:01:54.813302 25701 tablet_bootstrap.cc:654] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4: Neither blocks nor log segments found. Creating new log.
I20260430 02:01:54.814468 25705 leader_election.cc:290] T 2c080a5b649f453c903e1dce5ab6a113 P e481cdb11e4040f2b78a224a2bed5eaa [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers f5202ea2c8244e849a11073ee5d918c5 (127.24.153.66:43745), a4ec6dffeb11435b8655672771cd29c4 (127.24.153.67:34223)
I20260430 02:01:54.814736 25700 tablet_bootstrap.cc:492] T 29e24069ebd04300a2c4cf4d4bdc5e66 P e481cdb11e4040f2b78a224a2bed5eaa: Bootstrap starting.
I20260430 02:01:54.815454 25709 tablet_bootstrap.cc:492] T 29e24069ebd04300a2c4cf4d4bdc5e66 P f5202ea2c8244e849a11073ee5d918c5: No bootstrap required, opened a new log
I20260430 02:01:54.815660 25709 ts_tablet_manager.cc:1403] T 29e24069ebd04300a2c4cf4d4bdc5e66 P f5202ea2c8244e849a11073ee5d918c5: Time spent bootstrapping tablet: real 0.007s user 0.004s sys 0.000s
I20260430 02:01:54.815768 25700 tablet_bootstrap.cc:654] T 29e24069ebd04300a2c4cf4d4bdc5e66 P e481cdb11e4040f2b78a224a2bed5eaa: Neither blocks nor log segments found. Creating new log.
I20260430 02:01:54.819160 25501 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "2c080a5b649f453c903e1dce5ab6a113" candidate_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "f5202ea2c8244e849a11073ee5d918c5" is_pre_election: true
I20260430 02:01:54.819286 25709 raft_consensus.cc:359] T 29e24069ebd04300a2c4cf4d4bdc5e66 P f5202ea2c8244e849a11073ee5d918c5 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } }
I20260430 02:01:54.819432 25709 raft_consensus.cc:385] T 29e24069ebd04300a2c4cf4d4bdc5e66 P f5202ea2c8244e849a11073ee5d918c5 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260430 02:01:54.819465 25709 raft_consensus.cc:740] T 29e24069ebd04300a2c4cf4d4bdc5e66 P f5202ea2c8244e849a11073ee5d918c5 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: f5202ea2c8244e849a11073ee5d918c5, State: Initialized, Role: FOLLOWER
I20260430 02:01:54.819931 25709 consensus_queue.cc:260] T 29e24069ebd04300a2c4cf4d4bdc5e66 P f5202ea2c8244e849a11073ee5d918c5 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } }
I20260430 02:01:54.820365 25700 tablet_bootstrap.cc:492] T 29e24069ebd04300a2c4cf4d4bdc5e66 P e481cdb11e4040f2b78a224a2bed5eaa: No bootstrap required, opened a new log
I20260430 02:01:54.820493 25700 ts_tablet_manager.cc:1403] T 29e24069ebd04300a2c4cf4d4bdc5e66 P e481cdb11e4040f2b78a224a2bed5eaa: Time spent bootstrapping tablet: real 0.006s user 0.002s sys 0.000s
W20260430 02:01:54.820513 25306 leader_election.cc:343] T 2c080a5b649f453c903e1dce5ab6a113 P e481cdb11e4040f2b78a224a2bed5eaa [CANDIDATE]: Term 1 pre-election: Tablet error from VoteRequest() call to peer f5202ea2c8244e849a11073ee5d918c5 (127.24.153.66:43745): Illegal state: must be running to vote when last-logged opid is not known
I20260430 02:01:54.820631 25547 heartbeater.cc:499] Master 127.24.153.126:40723 was elected leader, sending a full tablet report...
I20260430 02:01:54.820878 25709 ts_tablet_manager.cc:1434] T 29e24069ebd04300a2c4cf4d4bdc5e66 P f5202ea2c8244e849a11073ee5d918c5: Time spent starting tablet: real 0.005s user 0.006s sys 0.000s
I20260430 02:01:54.820948 25700 raft_consensus.cc:359] T 29e24069ebd04300a2c4cf4d4bdc5e66 P e481cdb11e4040f2b78a224a2bed5eaa [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } }
I20260430 02:01:54.821094 25700 raft_consensus.cc:385] T 29e24069ebd04300a2c4cf4d4bdc5e66 P e481cdb11e4040f2b78a224a2bed5eaa [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260430 02:01:54.821137 25700 raft_consensus.cc:740] T 29e24069ebd04300a2c4cf4d4bdc5e66 P e481cdb11e4040f2b78a224a2bed5eaa [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: e481cdb11e4040f2b78a224a2bed5eaa, State: Initialized, Role: FOLLOWER
I20260430 02:01:54.821143 25709 tablet_bootstrap.cc:492] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5: Bootstrap starting.
I20260430 02:01:54.821291 25700 consensus_queue.cc:260] T 29e24069ebd04300a2c4cf4d4bdc5e66 P e481cdb11e4040f2b78a224a2bed5eaa [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } }
I20260430 02:01:54.821552 25700 ts_tablet_manager.cc:1434] T 29e24069ebd04300a2c4cf4d4bdc5e66 P e481cdb11e4040f2b78a224a2bed5eaa: Time spent starting tablet: real 0.001s user 0.001s sys 0.000s
I20260430 02:01:54.821812 25700 tablet_bootstrap.cc:492] T fbfe4b6b54594e36a76de6e54d2adb8c P e481cdb11e4040f2b78a224a2bed5eaa: Bootstrap starting.
I20260430 02:01:54.822925 25700 tablet_bootstrap.cc:654] T fbfe4b6b54594e36a76de6e54d2adb8c P e481cdb11e4040f2b78a224a2bed5eaa: Neither blocks nor log segments found. Creating new log.
I20260430 02:01:54.823398 25709 tablet_bootstrap.cc:654] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5: Neither blocks nor log segments found. Creating new log.
I20260430 02:01:54.825094 25632 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "2c080a5b649f453c903e1dce5ab6a113" candidate_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "a4ec6dffeb11435b8655672771cd29c4" is_pre_election: true
I20260430 02:01:54.825667 25700 tablet_bootstrap.cc:492] T fbfe4b6b54594e36a76de6e54d2adb8c P e481cdb11e4040f2b78a224a2bed5eaa: No bootstrap required, opened a new log
I20260430 02:01:54.825759 25700 ts_tablet_manager.cc:1403] T fbfe4b6b54594e36a76de6e54d2adb8c P e481cdb11e4040f2b78a224a2bed5eaa: Time spent bootstrapping tablet: real 0.004s user 0.002s sys 0.000s
W20260430 02:01:54.825805 25305 leader_election.cc:343] T 2c080a5b649f453c903e1dce5ab6a113 P e481cdb11e4040f2b78a224a2bed5eaa [CANDIDATE]: Term 1 pre-election: Tablet error from VoteRequest() call to peer a4ec6dffeb11435b8655672771cd29c4 (127.24.153.67:34223): Illegal state: must be running to vote when last-logged opid is not known
I20260430 02:01:54.825947 25305 leader_election.cc:304] T 2c080a5b649f453c903e1dce5ab6a113 P e481cdb11e4040f2b78a224a2bed5eaa [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: e481cdb11e4040f2b78a224a2bed5eaa; no voters: a4ec6dffeb11435b8655672771cd29c4, f5202ea2c8244e849a11073ee5d918c5
I20260430 02:01:54.826211 25705 raft_consensus.cc:2749] T 2c080a5b649f453c903e1dce5ab6a113 P e481cdb11e4040f2b78a224a2bed5eaa [term 0 FOLLOWER]: Leader pre-election lost for term 1. Reason: could not achieve majority
I20260430 02:01:54.826674 25700 raft_consensus.cc:359] T fbfe4b6b54594e36a76de6e54d2adb8c P e481cdb11e4040f2b78a224a2bed5eaa [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.827033 25700 raft_consensus.cc:385] T fbfe4b6b54594e36a76de6e54d2adb8c P e481cdb11e4040f2b78a224a2bed5eaa [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260430 02:01:54.827127 25700 raft_consensus.cc:740] T fbfe4b6b54594e36a76de6e54d2adb8c P e481cdb11e4040f2b78a224a2bed5eaa [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: e481cdb11e4040f2b78a224a2bed5eaa, State: Initialized, Role: FOLLOWER
I20260430 02:01:54.827152 25701 tablet_bootstrap.cc:492] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4: No bootstrap required, opened a new log
I20260430 02:01:54.827239 25701 ts_tablet_manager.cc:1403] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4: Time spent bootstrapping tablet: real 0.015s user 0.002s sys 0.000s
I20260430 02:01:54.827241 25709 tablet_bootstrap.cc:492] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5: No bootstrap required, opened a new log
I20260430 02:01:54.827330 25709 ts_tablet_manager.cc:1403] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5: Time spent bootstrapping tablet: real 0.006s user 0.002s sys 0.000s
I20260430 02:01:54.827437 25700 consensus_queue.cc:260] T fbfe4b6b54594e36a76de6e54d2adb8c P e481cdb11e4040f2b78a224a2bed5eaa [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.827809 25700 ts_tablet_manager.cc:1434] T fbfe4b6b54594e36a76de6e54d2adb8c P e481cdb11e4040f2b78a224a2bed5eaa: Time spent starting tablet: real 0.002s user 0.000s sys 0.002s
I20260430 02:01:54.827813 25701 raft_consensus.cc:359] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.827904 25701 raft_consensus.cc:385] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260430 02:01:54.827929 25701 raft_consensus.cc:740] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: a4ec6dffeb11435b8655672771cd29c4, State: Initialized, Role: FOLLOWER
I20260430 02:01:54.827930 25709 raft_consensus.cc:359] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.828049 25709 raft_consensus.cc:385] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260430 02:01:54.828095 25709 raft_consensus.cc:740] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: f5202ea2c8244e849a11073ee5d918c5, State: Initialized, Role: FOLLOWER
I20260430 02:01:54.828065 25701 consensus_queue.cc:260] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.828223 25709 consensus_queue.cc:260] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.828490 25709 ts_tablet_manager.cc:1434] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5: Time spent starting tablet: real 0.001s user 0.000s sys 0.000s
I20260430 02:01:54.828591 25701 ts_tablet_manager.cc:1434] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4: Time spent starting tablet: real 0.001s user 0.002s sys 0.000s
I20260430 02:01:54.828680 25709 tablet_bootstrap.cc:492] T 2c080a5b649f453c903e1dce5ab6a113 P f5202ea2c8244e849a11073ee5d918c5: Bootstrap starting.
I20260430 02:01:54.829998 25701 tablet_bootstrap.cc:492] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4: Bootstrap starting.
I20260430 02:01:54.830081 25709 tablet_bootstrap.cc:654] T 2c080a5b649f453c903e1dce5ab6a113 P f5202ea2c8244e849a11073ee5d918c5: Neither blocks nor log segments found. Creating new log.
I20260430 02:01:54.832376 25701 tablet_bootstrap.cc:654] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4: Neither blocks nor log segments found. Creating new log.
I20260430 02:01:54.833586 25701 tablet_bootstrap.cc:492] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4: No bootstrap required, opened a new log
I20260430 02:01:54.833578 25709 tablet_bootstrap.cc:492] T 2c080a5b649f453c903e1dce5ab6a113 P f5202ea2c8244e849a11073ee5d918c5: No bootstrap required, opened a new log
I20260430 02:01:54.833712 25701 ts_tablet_manager.cc:1403] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4: Time spent bootstrapping tablet: real 0.004s user 0.002s sys 0.000s
I20260430 02:01:54.833743 25709 ts_tablet_manager.cc:1403] T 2c080a5b649f453c903e1dce5ab6a113 P f5202ea2c8244e849a11073ee5d918c5: Time spent bootstrapping tablet: real 0.005s user 0.000s sys 0.002s
I20260430 02:01:54.834237 25701 raft_consensus.cc:359] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } }
I20260430 02:01:54.834363 25701 raft_consensus.cc:385] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260430 02:01:54.834311 25709 raft_consensus.cc:359] T 2c080a5b649f453c903e1dce5ab6a113 P f5202ea2c8244e849a11073ee5d918c5 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.834412 25701 raft_consensus.cc:740] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: a4ec6dffeb11435b8655672771cd29c4, State: Initialized, Role: FOLLOWER
I20260430 02:01:54.834462 25709 raft_consensus.cc:385] T 2c080a5b649f453c903e1dce5ab6a113 P f5202ea2c8244e849a11073ee5d918c5 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260430 02:01:54.834509 25709 raft_consensus.cc:740] T 2c080a5b649f453c903e1dce5ab6a113 P f5202ea2c8244e849a11073ee5d918c5 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: f5202ea2c8244e849a11073ee5d918c5, State: Initialized, Role: FOLLOWER
I20260430 02:01:54.834561 25701 consensus_queue.cc:260] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } }
I20260430 02:01:54.834762 25709 consensus_queue.cc:260] T 2c080a5b649f453c903e1dce5ab6a113 P f5202ea2c8244e849a11073ee5d918c5 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.834925 25701 ts_tablet_manager.cc:1434] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4: Time spent starting tablet: real 0.001s user 0.001s sys 0.000s
I20260430 02:01:54.835129 25709 ts_tablet_manager.cc:1434] T 2c080a5b649f453c903e1dce5ab6a113 P f5202ea2c8244e849a11073ee5d918c5: Time spent starting tablet: real 0.001s user 0.000s sys 0.002s
I20260430 02:01:54.835182 25701 tablet_bootstrap.cc:492] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4: Bootstrap starting.
I20260430 02:01:54.835424 25709 tablet_bootstrap.cc:492] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5: Bootstrap starting.
I20260430 02:01:54.836476 25701 tablet_bootstrap.cc:654] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4: Neither blocks nor log segments found. Creating new log.
I20260430 02:01:54.836578 25709 tablet_bootstrap.cc:654] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5: Neither blocks nor log segments found. Creating new log.
I20260430 02:01:54.838112 25709 tablet_bootstrap.cc:492] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5: No bootstrap required, opened a new log
I20260430 02:01:54.838084 25704 raft_consensus.cc:493] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260430 02:01:54.838217 25709 ts_tablet_manager.cc:1403] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5: Time spent bootstrapping tablet: real 0.003s user 0.001s sys 0.001s
I20260430 02:01:54.838263 25704 raft_consensus.cc:515] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.838816 25701 tablet_bootstrap.cc:492] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4: No bootstrap required, opened a new log
I20260430 02:01:54.838851 25709 raft_consensus.cc:359] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.838919 25701 ts_tablet_manager.cc:1403] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4: Time spent bootstrapping tablet: real 0.004s user 0.002s sys 0.000s
I20260430 02:01:54.838958 25709 raft_consensus.cc:385] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260430 02:01:54.838992 25709 raft_consensus.cc:740] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: f5202ea2c8244e849a11073ee5d918c5, State: Initialized, Role: FOLLOWER
I20260430 02:01:54.839186 25709 consensus_queue.cc:260] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.839538 25709 ts_tablet_manager.cc:1434] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5: Time spent starting tablet: real 0.001s user 0.000s sys 0.001s
I20260430 02:01:54.839566 25704 leader_election.cc:290] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971), f5202ea2c8244e849a11073ee5d918c5 (127.24.153.66:43745)
I20260430 02:01:54.839740 25709 tablet_bootstrap.cc:492] T 869a29991dbe4537bf1082d1c9ee2ecd P f5202ea2c8244e849a11073ee5d918c5: Bootstrap starting.
I20260430 02:01:54.839731 25701 raft_consensus.cc:359] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.839833 25701 raft_consensus.cc:385] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260430 02:01:54.839866 25701 raft_consensus.cc:740] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: a4ec6dffeb11435b8655672771cd29c4, State: Initialized, Role: FOLLOWER
I20260430 02:01:54.839969 25701 consensus_queue.cc:260] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.840332 25701 ts_tablet_manager.cc:1434] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4: Time spent starting tablet: real 0.001s user 0.001s sys 0.000s
I20260430 02:01:54.840516 25701 tablet_bootstrap.cc:492] T 63c4457448ec4b1b8a0741a7560cddfe P a4ec6dffeb11435b8655672771cd29c4: Bootstrap starting.
I20260430 02:01:54.842921 25724 raft_consensus.cc:493] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260430 02:01:54.843146 25724 raft_consensus.cc:515] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.843564 25701 tablet_bootstrap.cc:654] T 63c4457448ec4b1b8a0741a7560cddfe P a4ec6dffeb11435b8655672771cd29c4: Neither blocks nor log segments found. Creating new log.
I20260430 02:01:54.844883 25701 tablet_bootstrap.cc:492] T 63c4457448ec4b1b8a0741a7560cddfe P a4ec6dffeb11435b8655672771cd29c4: No bootstrap required, opened a new log
I20260430 02:01:54.844869 25724 leader_election.cc:290] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971), a4ec6dffeb11435b8655672771cd29c4 (127.24.153.67:34223)
I20260430 02:01:54.844983 25701 ts_tablet_manager.cc:1403] T 63c4457448ec4b1b8a0741a7560cddfe P a4ec6dffeb11435b8655672771cd29c4: Time spent bootstrapping tablet: real 0.005s user 0.002s sys 0.000s
I20260430 02:01:54.845716 25701 raft_consensus.cc:359] T 63c4457448ec4b1b8a0741a7560cddfe P a4ec6dffeb11435b8655672771cd29c4 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.845860 25701 raft_consensus.cc:385] T 63c4457448ec4b1b8a0741a7560cddfe P a4ec6dffeb11435b8655672771cd29c4 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260430 02:01:54.845903 25701 raft_consensus.cc:740] T 63c4457448ec4b1b8a0741a7560cddfe P a4ec6dffeb11435b8655672771cd29c4 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: a4ec6dffeb11435b8655672771cd29c4, State: Initialized, Role: FOLLOWER
I20260430 02:01:54.846064 25701 consensus_queue.cc:260] T 63c4457448ec4b1b8a0741a7560cddfe P a4ec6dffeb11435b8655672771cd29c4 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.846359 25701 ts_tablet_manager.cc:1434] T 63c4457448ec4b1b8a0741a7560cddfe P a4ec6dffeb11435b8655672771cd29c4: Time spent starting tablet: real 0.001s user 0.000s sys 0.002s
I20260430 02:01:54.846577 25701 tablet_bootstrap.cc:492] T f1dbbbee06674a93a3dd31c45c90d59d P a4ec6dffeb11435b8655672771cd29c4: Bootstrap starting.
I20260430 02:01:54.847318 25370 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "869a29991dbe4537bf1082d1c9ee2ecd" candidate_uuid: "a4ec6dffeb11435b8655672771cd29c4" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" is_pre_election: true
I20260430 02:01:54.847527 25370 raft_consensus.cc:2468] T 869a29991dbe4537bf1082d1c9ee2ecd P e481cdb11e4040f2b78a224a2bed5eaa [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate a4ec6dffeb11435b8655672771cd29c4 in term 0.
I20260430 02:01:54.847819 25701 tablet_bootstrap.cc:654] T f1dbbbee06674a93a3dd31c45c90d59d P a4ec6dffeb11435b8655672771cd29c4: Neither blocks nor log segments found. Creating new log.
I20260430 02:01:54.848018 25568 leader_election.cc:304] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: a4ec6dffeb11435b8655672771cd29c4, e481cdb11e4040f2b78a224a2bed5eaa; no voters:
I20260430 02:01:54.848573 25704 raft_consensus.cc:2804] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20260430 02:01:54.848675 25704 raft_consensus.cc:493] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260430 02:01:54.848726 25704 raft_consensus.cc:3060] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 [term 0 FOLLOWER]: Advancing to term 1
I20260430 02:01:54.848942 25709 tablet_bootstrap.cc:654] T 869a29991dbe4537bf1082d1c9ee2ecd P f5202ea2c8244e849a11073ee5d918c5: Neither blocks nor log segments found. Creating new log.
I20260430 02:01:54.849421 25632 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "63c4457448ec4b1b8a0741a7560cddfe" candidate_uuid: "f5202ea2c8244e849a11073ee5d918c5" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "a4ec6dffeb11435b8655672771cd29c4" is_pre_election: true
I20260430 02:01:54.849587 25632 raft_consensus.cc:2468] T 63c4457448ec4b1b8a0741a7560cddfe P a4ec6dffeb11435b8655672771cd29c4 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate f5202ea2c8244e849a11073ee5d918c5 in term 0.
I20260430 02:01:54.850010 25436 leader_election.cc:304] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: a4ec6dffeb11435b8655672771cd29c4, f5202ea2c8244e849a11073ee5d918c5; no voters:
I20260430 02:01:54.850288 25716 raft_consensus.cc:2804] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20260430 02:01:54.850225 25704 raft_consensus.cc:515] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.850355 25716 raft_consensus.cc:493] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260430 02:01:54.850414 25716 raft_consensus.cc:3060] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 [term 0 FOLLOWER]: Advancing to term 1
I20260430 02:01:54.850632 25501 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "869a29991dbe4537bf1082d1c9ee2ecd" candidate_uuid: "a4ec6dffeb11435b8655672771cd29c4" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "f5202ea2c8244e849a11073ee5d918c5" is_pre_election: true
I20260430 02:01:54.850723 25704 leader_election.cc:290] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 [CANDIDATE]: Term 1 election: Requested vote from peers e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971), f5202ea2c8244e849a11073ee5d918c5 (127.24.153.66:43745)
I20260430 02:01:54.851120 25370 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "869a29991dbe4537bf1082d1c9ee2ecd" candidate_uuid: "a4ec6dffeb11435b8655672771cd29c4" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "e481cdb11e4040f2b78a224a2bed5eaa"
I20260430 02:01:54.851246 25370 raft_consensus.cc:3060] T 869a29991dbe4537bf1082d1c9ee2ecd P e481cdb11e4040f2b78a224a2bed5eaa [term 0 FOLLOWER]: Advancing to term 1
I20260430 02:01:54.851259 25501 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "869a29991dbe4537bf1082d1c9ee2ecd" candidate_uuid: "a4ec6dffeb11435b8655672771cd29c4" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "f5202ea2c8244e849a11073ee5d918c5"
I20260430 02:01:54.851574 25716 raft_consensus.cc:515] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
W20260430 02:01:54.851662 25568 leader_election.cc:343] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 [CANDIDATE]: Term 1 pre-election: Tablet error from VoteRequest() call to peer f5202ea2c8244e849a11073ee5d918c5 (127.24.153.66:43745): Illegal state: must be running to vote when last-logged opid is not known
W20260430 02:01:54.852046 25568 leader_election.cc:343] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 [CANDIDATE]: Term 1 election: Tablet error from VoteRequest() call to peer f5202ea2c8244e849a11073ee5d918c5 (127.24.153.66:43745): Illegal state: must be running to vote when last-logged opid is not known
I20260430 02:01:54.852303 25632 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "63c4457448ec4b1b8a0741a7560cddfe" candidate_uuid: "f5202ea2c8244e849a11073ee5d918c5" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "a4ec6dffeb11435b8655672771cd29c4"
I20260430 02:01:54.852396 25370 raft_consensus.cc:2468] T 869a29991dbe4537bf1082d1c9ee2ecd P e481cdb11e4040f2b78a224a2bed5eaa [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate a4ec6dffeb11435b8655672771cd29c4 in term 1.
I20260430 02:01:54.852419 25632 raft_consensus.cc:3060] T 63c4457448ec4b1b8a0741a7560cddfe P a4ec6dffeb11435b8655672771cd29c4 [term 0 FOLLOWER]: Advancing to term 1
I20260430 02:01:54.852738 25568 leader_election.cc:304] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: a4ec6dffeb11435b8655672771cd29c4, e481cdb11e4040f2b78a224a2bed5eaa; no voters: f5202ea2c8244e849a11073ee5d918c5
I20260430 02:01:54.852962 25704 raft_consensus.cc:2804] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 [term 1 FOLLOWER]: Leader election won for term 1
I20260430 02:01:54.853046 25704 raft_consensus.cc:697] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 [term 1 LEADER]: Becoming Leader. State: Replica: a4ec6dffeb11435b8655672771cd29c4, State: Running, Role: LEADER
I20260430 02:01:54.853353 25704 consensus_queue.cc:237] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.853506 25705 raft_consensus.cc:493] T fbfe4b6b54594e36a76de6e54d2adb8c P e481cdb11e4040f2b78a224a2bed5eaa [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260430 02:01:54.853379 25632 raft_consensus.cc:2468] T 63c4457448ec4b1b8a0741a7560cddfe P a4ec6dffeb11435b8655672771cd29c4 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate f5202ea2c8244e849a11073ee5d918c5 in term 1.
I20260430 02:01:54.853582 25705 raft_consensus.cc:515] T fbfe4b6b54594e36a76de6e54d2adb8c P e481cdb11e4040f2b78a224a2bed5eaa [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.853947 25705 leader_election.cc:290] T fbfe4b6b54594e36a76de6e54d2adb8c P e481cdb11e4040f2b78a224a2bed5eaa [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers f5202ea2c8244e849a11073ee5d918c5 (127.24.153.66:43745), a4ec6dffeb11435b8655672771cd29c4 (127.24.153.67:34223)
I20260430 02:01:54.853338 25701 tablet_bootstrap.cc:492] T f1dbbbee06674a93a3dd31c45c90d59d P a4ec6dffeb11435b8655672771cd29c4: No bootstrap required, opened a new log
I20260430 02:01:54.854097 25701 ts_tablet_manager.cc:1403] T f1dbbbee06674a93a3dd31c45c90d59d P a4ec6dffeb11435b8655672771cd29c4: Time spent bootstrapping tablet: real 0.008s user 0.000s sys 0.002s
I20260430 02:01:54.854262 25709 tablet_bootstrap.cc:492] T 869a29991dbe4537bf1082d1c9ee2ecd P f5202ea2c8244e849a11073ee5d918c5: No bootstrap required, opened a new log
I20260430 02:01:54.854341 25709 ts_tablet_manager.cc:1403] T 869a29991dbe4537bf1082d1c9ee2ecd P f5202ea2c8244e849a11073ee5d918c5: Time spent bootstrapping tablet: real 0.015s user 0.000s sys 0.002s
I20260430 02:01:54.854369 25436 leader_election.cc:304] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: a4ec6dffeb11435b8655672771cd29c4, f5202ea2c8244e849a11073ee5d918c5; no voters:
I20260430 02:01:54.854660 25724 raft_consensus.cc:2804] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 [term 1 FOLLOWER]: Leader election won for term 1
I20260430 02:01:54.854636 25701 raft_consensus.cc:359] T f1dbbbee06674a93a3dd31c45c90d59d P a4ec6dffeb11435b8655672771cd29c4 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.854779 25701 raft_consensus.cc:385] T f1dbbbee06674a93a3dd31c45c90d59d P a4ec6dffeb11435b8655672771cd29c4 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260430 02:01:54.854826 25701 raft_consensus.cc:740] T f1dbbbee06674a93a3dd31c45c90d59d P a4ec6dffeb11435b8655672771cd29c4 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: a4ec6dffeb11435b8655672771cd29c4, State: Initialized, Role: FOLLOWER
I20260430 02:01:54.854913 25724 raft_consensus.cc:697] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 [term 1 LEADER]: Becoming Leader. State: Replica: f5202ea2c8244e849a11073ee5d918c5, State: Running, Role: LEADER
I20260430 02:01:54.854888 25709 raft_consensus.cc:359] T 869a29991dbe4537bf1082d1c9ee2ecd P f5202ea2c8244e849a11073ee5d918c5 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.854991 25709 raft_consensus.cc:385] T 869a29991dbe4537bf1082d1c9ee2ecd P f5202ea2c8244e849a11073ee5d918c5 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260430 02:01:54.854954 25701 consensus_queue.cc:260] T f1dbbbee06674a93a3dd31c45c90d59d P a4ec6dffeb11435b8655672771cd29c4 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.855026 25709 raft_consensus.cc:740] T 869a29991dbe4537bf1082d1c9ee2ecd P f5202ea2c8244e849a11073ee5d918c5 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: f5202ea2c8244e849a11073ee5d918c5, State: Initialized, Role: FOLLOWER
I20260430 02:01:54.855191 25724 consensus_queue.cc:237] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.855365 25701 ts_tablet_manager.cc:1434] T f1dbbbee06674a93a3dd31c45c90d59d P a4ec6dffeb11435b8655672771cd29c4: Time spent starting tablet: real 0.001s user 0.001s sys 0.000s
I20260430 02:01:54.855692 25632 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "fbfe4b6b54594e36a76de6e54d2adb8c" candidate_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "a4ec6dffeb11435b8655672771cd29c4" is_pre_election: true
I20260430 02:01:54.855816 25632 raft_consensus.cc:2468] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate e481cdb11e4040f2b78a224a2bed5eaa in term 0.
I20260430 02:01:54.856150 25305 leader_election.cc:304] T fbfe4b6b54594e36a76de6e54d2adb8c P e481cdb11e4040f2b78a224a2bed5eaa [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: a4ec6dffeb11435b8655672771cd29c4, e481cdb11e4040f2b78a224a2bed5eaa; no voters:
I20260430 02:01:54.856119 25709 consensus_queue.cc:260] T 869a29991dbe4537bf1082d1c9ee2ecd P f5202ea2c8244e849a11073ee5d918c5 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.856333 25705 raft_consensus.cc:2804] T fbfe4b6b54594e36a76de6e54d2adb8c P e481cdb11e4040f2b78a224a2bed5eaa [term 0 FOLLOWER]: Leader pre-election won for term 1
I20260430 02:01:54.856410 25705 raft_consensus.cc:493] T fbfe4b6b54594e36a76de6e54d2adb8c P e481cdb11e4040f2b78a224a2bed5eaa [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260430 02:01:54.856452 25705 raft_consensus.cc:3060] T fbfe4b6b54594e36a76de6e54d2adb8c P e481cdb11e4040f2b78a224a2bed5eaa [term 0 FOLLOWER]: Advancing to term 1
I20260430 02:01:54.856688 25501 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "fbfe4b6b54594e36a76de6e54d2adb8c" candidate_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "f5202ea2c8244e849a11073ee5d918c5" is_pre_election: true
I20260430 02:01:54.856966 25716 leader_election.cc:290] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 [CANDIDATE]: Term 1 election: Requested vote from peers e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971), a4ec6dffeb11435b8655672771cd29c4 (127.24.153.67:34223)
W20260430 02:01:54.857021 25306 leader_election.cc:343] T fbfe4b6b54594e36a76de6e54d2adb8c P e481cdb11e4040f2b78a224a2bed5eaa [CANDIDATE]: Term 1 pre-election: Tablet error from VoteRequest() call to peer f5202ea2c8244e849a11073ee5d918c5 (127.24.153.66:43745): Illegal state: must be running to vote when last-logged opid is not known
I20260430 02:01:54.857554 25705 raft_consensus.cc:515] T fbfe4b6b54594e36a76de6e54d2adb8c P e481cdb11e4040f2b78a224a2bed5eaa [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.857997 25705 leader_election.cc:290] T fbfe4b6b54594e36a76de6e54d2adb8c P e481cdb11e4040f2b78a224a2bed5eaa [CANDIDATE]: Term 1 election: Requested vote from peers f5202ea2c8244e849a11073ee5d918c5 (127.24.153.66:43745), a4ec6dffeb11435b8655672771cd29c4 (127.24.153.67:34223)
I20260430 02:01:54.858222 25501 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "fbfe4b6b54594e36a76de6e54d2adb8c" candidate_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "f5202ea2c8244e849a11073ee5d918c5"
I20260430 02:01:54.858367 25709 ts_tablet_manager.cc:1434] T 869a29991dbe4537bf1082d1c9ee2ecd P f5202ea2c8244e849a11073ee5d918c5: Time spent starting tablet: real 0.004s user 0.002s sys 0.000s
W20260430 02:01:54.858546 25306 leader_election.cc:343] T fbfe4b6b54594e36a76de6e54d2adb8c P e481cdb11e4040f2b78a224a2bed5eaa [CANDIDATE]: Term 1 election: Tablet error from VoteRequest() call to peer f5202ea2c8244e849a11073ee5d918c5 (127.24.153.66:43745): Illegal state: must be running to vote when last-logged opid is not known
I20260430 02:01:54.858588 25709 tablet_bootstrap.cc:492] T fbfe4b6b54594e36a76de6e54d2adb8c P f5202ea2c8244e849a11073ee5d918c5: Bootstrap starting.
I20260430 02:01:54.858795 25227 catalog_manager.cc:5671] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 reported cstate change: term changed from 0 to 1, leader changed from <none> to a4ec6dffeb11435b8655672771cd29c4 (127.24.153.67). New cstate: current_term: 1 leader_uuid: "a4ec6dffeb11435b8655672771cd29c4" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } health_report { overall_health: HEALTHY } } }
I20260430 02:01:54.859738 25709 tablet_bootstrap.cc:654] T fbfe4b6b54594e36a76de6e54d2adb8c P f5202ea2c8244e849a11073ee5d918c5: Neither blocks nor log segments found. Creating new log.
I20260430 02:01:54.860448 25632 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "fbfe4b6b54594e36a76de6e54d2adb8c" candidate_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "a4ec6dffeb11435b8655672771cd29c4"
I20260430 02:01:54.860581 25632 raft_consensus.cc:3060] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 [term 0 FOLLOWER]: Advancing to term 1
I20260430 02:01:54.860963 25709 tablet_bootstrap.cc:492] T fbfe4b6b54594e36a76de6e54d2adb8c P f5202ea2c8244e849a11073ee5d918c5: No bootstrap required, opened a new log
I20260430 02:01:54.861039 25709 ts_tablet_manager.cc:1403] T fbfe4b6b54594e36a76de6e54d2adb8c P f5202ea2c8244e849a11073ee5d918c5: Time spent bootstrapping tablet: real 0.003s user 0.002s sys 0.000s
I20260430 02:01:54.861521 25709 raft_consensus.cc:359] T fbfe4b6b54594e36a76de6e54d2adb8c P f5202ea2c8244e849a11073ee5d918c5 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.861613 25709 raft_consensus.cc:385] T fbfe4b6b54594e36a76de6e54d2adb8c P f5202ea2c8244e849a11073ee5d918c5 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260430 02:01:54.861608 25632 raft_consensus.cc:2468] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate e481cdb11e4040f2b78a224a2bed5eaa in term 1.
I20260430 02:01:54.861645 25709 raft_consensus.cc:740] T fbfe4b6b54594e36a76de6e54d2adb8c P f5202ea2c8244e849a11073ee5d918c5 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: f5202ea2c8244e849a11073ee5d918c5, State: Initialized, Role: FOLLOWER
I20260430 02:01:54.861763 25709 consensus_queue.cc:260] T fbfe4b6b54594e36a76de6e54d2adb8c P f5202ea2c8244e849a11073ee5d918c5 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.862100 25709 ts_tablet_manager.cc:1434] T fbfe4b6b54594e36a76de6e54d2adb8c P f5202ea2c8244e849a11073ee5d918c5: Time spent starting tablet: real 0.001s user 0.001s sys 0.000s
I20260430 02:01:54.862149 25305 leader_election.cc:304] T fbfe4b6b54594e36a76de6e54d2adb8c P e481cdb11e4040f2b78a224a2bed5eaa [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: a4ec6dffeb11435b8655672771cd29c4, e481cdb11e4040f2b78a224a2bed5eaa; no voters: f5202ea2c8244e849a11073ee5d918c5
I20260430 02:01:54.862488 25705 raft_consensus.cc:2804] T fbfe4b6b54594e36a76de6e54d2adb8c P e481cdb11e4040f2b78a224a2bed5eaa [term 1 FOLLOWER]: Leader election won for term 1
I20260430 02:01:54.862442 25226 catalog_manager.cc:5671] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 reported cstate change: term changed from 0 to 1, leader changed from <none> to f5202ea2c8244e849a11073ee5d918c5 (127.24.153.66). New cstate: current_term: 1 leader_uuid: "f5202ea2c8244e849a11073ee5d918c5" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } health_report { overall_health: UNKNOWN } } }
I20260430 02:01:54.862572 25705 raft_consensus.cc:697] T fbfe4b6b54594e36a76de6e54d2adb8c P e481cdb11e4040f2b78a224a2bed5eaa [term 1 LEADER]: Becoming Leader. State: Replica: e481cdb11e4040f2b78a224a2bed5eaa, State: Running, Role: LEADER
I20260430 02:01:54.862860 25705 consensus_queue.cc:237] T fbfe4b6b54594e36a76de6e54d2adb8c P e481cdb11e4040f2b78a224a2bed5eaa [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.864928 25370 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "63c4457448ec4b1b8a0741a7560cddfe" candidate_uuid: "f5202ea2c8244e849a11073ee5d918c5" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" is_pre_election: true
I20260430 02:01:54.864998 25369 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "63c4457448ec4b1b8a0741a7560cddfe" candidate_uuid: "f5202ea2c8244e849a11073ee5d918c5" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "e481cdb11e4040f2b78a224a2bed5eaa"
I20260430 02:01:54.865056 25370 raft_consensus.cc:2468] T 63c4457448ec4b1b8a0741a7560cddfe P e481cdb11e4040f2b78a224a2bed5eaa [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate f5202ea2c8244e849a11073ee5d918c5 in term 0.
I20260430 02:01:54.867483 25226 catalog_manager.cc:5671] T fbfe4b6b54594e36a76de6e54d2adb8c P e481cdb11e4040f2b78a224a2bed5eaa reported cstate change: term changed from 0 to 1, leader changed from <none> to e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65). New cstate: current_term: 1 leader_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } health_report { overall_health: UNKNOWN } } }
I20260430 02:01:54.872574 25733 raft_consensus.cc:493] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260430 02:01:54.872711 25733 raft_consensus.cc:515] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.873080 25733 leader_election.cc:290] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971), f5202ea2c8244e849a11073ee5d918c5 (127.24.153.66:43745)
I20260430 02:01:54.873664 25369 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "2c080a5b649f453c903e1dce5ab6a113" candidate_uuid: "a4ec6dffeb11435b8655672771cd29c4" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" is_pre_election: true
I20260430 02:01:54.873685 25501 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "2c080a5b649f453c903e1dce5ab6a113" candidate_uuid: "a4ec6dffeb11435b8655672771cd29c4" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "f5202ea2c8244e849a11073ee5d918c5" is_pre_election: true
I20260430 02:01:54.873831 25369 raft_consensus.cc:2468] T 2c080a5b649f453c903e1dce5ab6a113 P e481cdb11e4040f2b78a224a2bed5eaa [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate a4ec6dffeb11435b8655672771cd29c4 in term 0.
I20260430 02:01:54.873868 25501 raft_consensus.cc:2468] T 2c080a5b649f453c903e1dce5ab6a113 P f5202ea2c8244e849a11073ee5d918c5 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate a4ec6dffeb11435b8655672771cd29c4 in term 0.
I20260430 02:01:54.874218 25568 leader_election.cc:304] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: a4ec6dffeb11435b8655672771cd29c4, e481cdb11e4040f2b78a224a2bed5eaa; no voters:
I20260430 02:01:54.874436 25733 raft_consensus.cc:2804] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20260430 02:01:54.874601 25705 raft_consensus.cc:493] T f1dbbbee06674a93a3dd31c45c90d59d P e481cdb11e4040f2b78a224a2bed5eaa [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260430 02:01:54.874694 25733 raft_consensus.cc:493] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260430 02:01:54.874722 25705 raft_consensus.cc:515] T f1dbbbee06674a93a3dd31c45c90d59d P e481cdb11e4040f2b78a224a2bed5eaa [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.874789 25733 raft_consensus.cc:3060] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 [term 0 FOLLOWER]: Advancing to term 1
I20260430 02:01:54.875120 25705 leader_election.cc:290] T f1dbbbee06674a93a3dd31c45c90d59d P e481cdb11e4040f2b78a224a2bed5eaa [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers f5202ea2c8244e849a11073ee5d918c5 (127.24.153.66:43745), a4ec6dffeb11435b8655672771cd29c4 (127.24.153.67:34223)
I20260430 02:01:54.875443 25501 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "f1dbbbee06674a93a3dd31c45c90d59d" candidate_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "f5202ea2c8244e849a11073ee5d918c5" is_pre_election: true
I20260430 02:01:54.875492 25632 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "f1dbbbee06674a93a3dd31c45c90d59d" candidate_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "a4ec6dffeb11435b8655672771cd29c4" is_pre_election: true
I20260430 02:01:54.875579 25501 raft_consensus.cc:2468] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate e481cdb11e4040f2b78a224a2bed5eaa in term 0.
I20260430 02:01:54.875639 25632 raft_consensus.cc:2468] T f1dbbbee06674a93a3dd31c45c90d59d P a4ec6dffeb11435b8655672771cd29c4 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate e481cdb11e4040f2b78a224a2bed5eaa in term 0.
I20260430 02:01:54.875928 25306 leader_election.cc:304] T f1dbbbee06674a93a3dd31c45c90d59d P e481cdb11e4040f2b78a224a2bed5eaa [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: e481cdb11e4040f2b78a224a2bed5eaa, f5202ea2c8244e849a11073ee5d918c5; no voters:
I20260430 02:01:54.875986 25733 raft_consensus.cc:515] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.876188 25705 raft_consensus.cc:2804] T f1dbbbee06674a93a3dd31c45c90d59d P e481cdb11e4040f2b78a224a2bed5eaa [term 0 FOLLOWER]: Leader pre-election won for term 1
I20260430 02:01:54.876252 25705 raft_consensus.cc:493] T f1dbbbee06674a93a3dd31c45c90d59d P e481cdb11e4040f2b78a224a2bed5eaa [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260430 02:01:54.876305 25705 raft_consensus.cc:3060] T f1dbbbee06674a93a3dd31c45c90d59d P e481cdb11e4040f2b78a224a2bed5eaa [term 0 FOLLOWER]: Advancing to term 1
I20260430 02:01:54.876348 25733 leader_election.cc:290] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 [CANDIDATE]: Term 1 election: Requested vote from peers e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971), f5202ea2c8244e849a11073ee5d918c5 (127.24.153.66:43745)
I20260430 02:01:54.876972 25501 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "2c080a5b649f453c903e1dce5ab6a113" candidate_uuid: "a4ec6dffeb11435b8655672771cd29c4" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "f5202ea2c8244e849a11073ee5d918c5"
I20260430 02:01:54.876971 25369 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "2c080a5b649f453c903e1dce5ab6a113" candidate_uuid: "a4ec6dffeb11435b8655672771cd29c4" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "e481cdb11e4040f2b78a224a2bed5eaa"
I20260430 02:01:54.877105 25369 raft_consensus.cc:3060] T 2c080a5b649f453c903e1dce5ab6a113 P e481cdb11e4040f2b78a224a2bed5eaa [term 0 FOLLOWER]: Advancing to term 1
I20260430 02:01:54.877107 25501 raft_consensus.cc:3060] T 2c080a5b649f453c903e1dce5ab6a113 P f5202ea2c8244e849a11073ee5d918c5 [term 0 FOLLOWER]: Advancing to term 1
I20260430 02:01:54.877161 25705 raft_consensus.cc:515] T f1dbbbee06674a93a3dd31c45c90d59d P e481cdb11e4040f2b78a224a2bed5eaa [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.877508 25705 leader_election.cc:290] T f1dbbbee06674a93a3dd31c45c90d59d P e481cdb11e4040f2b78a224a2bed5eaa [CANDIDATE]: Term 1 election: Requested vote from peers f5202ea2c8244e849a11073ee5d918c5 (127.24.153.66:43745), a4ec6dffeb11435b8655672771cd29c4 (127.24.153.67:34223)
I20260430 02:01:54.877995 25500 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "f1dbbbee06674a93a3dd31c45c90d59d" candidate_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "f5202ea2c8244e849a11073ee5d918c5"
I20260430 02:01:54.878096 25500 raft_consensus.cc:3060] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 [term 0 FOLLOWER]: Advancing to term 1
I20260430 02:01:54.878135 25369 raft_consensus.cc:2468] T 2c080a5b649f453c903e1dce5ab6a113 P e481cdb11e4040f2b78a224a2bed5eaa [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate a4ec6dffeb11435b8655672771cd29c4 in term 1.
I20260430 02:01:54.878253 25501 raft_consensus.cc:2468] T 2c080a5b649f453c903e1dce5ab6a113 P f5202ea2c8244e849a11073ee5d918c5 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate a4ec6dffeb11435b8655672771cd29c4 in term 1.
I20260430 02:01:54.878327 25632 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "f1dbbbee06674a93a3dd31c45c90d59d" candidate_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "a4ec6dffeb11435b8655672771cd29c4"
I20260430 02:01:54.878402 25632 raft_consensus.cc:3060] T f1dbbbee06674a93a3dd31c45c90d59d P a4ec6dffeb11435b8655672771cd29c4 [term 0 FOLLOWER]: Advancing to term 1
I20260430 02:01:54.878543 25568 leader_election.cc:304] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: a4ec6dffeb11435b8655672771cd29c4, e481cdb11e4040f2b78a224a2bed5eaa; no voters:
I20260430 02:01:54.878734 25733 raft_consensus.cc:2804] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 [term 1 FOLLOWER]: Leader election won for term 1
I20260430 02:01:54.878831 25733 raft_consensus.cc:697] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 [term 1 LEADER]: Becoming Leader. State: Replica: a4ec6dffeb11435b8655672771cd29c4, State: Running, Role: LEADER
I20260430 02:01:54.878973 25733 consensus_queue.cc:237] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.879153 25500 raft_consensus.cc:2468] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate e481cdb11e4040f2b78a224a2bed5eaa in term 1.
I20260430 02:01:54.879220 25632 raft_consensus.cc:2468] T f1dbbbee06674a93a3dd31c45c90d59d P a4ec6dffeb11435b8655672771cd29c4 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate e481cdb11e4040f2b78a224a2bed5eaa in term 1.
I20260430 02:01:54.879475 25306 leader_election.cc:304] T f1dbbbee06674a93a3dd31c45c90d59d P e481cdb11e4040f2b78a224a2bed5eaa [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: e481cdb11e4040f2b78a224a2bed5eaa, f5202ea2c8244e849a11073ee5d918c5; no voters:
I20260430 02:01:54.879657 25705 raft_consensus.cc:2804] T f1dbbbee06674a93a3dd31c45c90d59d P e481cdb11e4040f2b78a224a2bed5eaa [term 1 FOLLOWER]: Leader election won for term 1
I20260430 02:01:54.879722 25705 raft_consensus.cc:697] T f1dbbbee06674a93a3dd31c45c90d59d P e481cdb11e4040f2b78a224a2bed5eaa [term 1 LEADER]: Becoming Leader. State: Replica: e481cdb11e4040f2b78a224a2bed5eaa, State: Running, Role: LEADER
I20260430 02:01:54.879850 25705 consensus_queue.cc:237] T f1dbbbee06674a93a3dd31c45c90d59d P e481cdb11e4040f2b78a224a2bed5eaa [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.880897 25226 catalog_manager.cc:5671] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 reported cstate change: term changed from 0 to 1, leader changed from <none> to a4ec6dffeb11435b8655672771cd29c4 (127.24.153.67). New cstate: current_term: 1 leader_uuid: "a4ec6dffeb11435b8655672771cd29c4" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } health_report { overall_health: HEALTHY } } }
I20260430 02:01:54.881443 25225 catalog_manager.cc:5671] T f1dbbbee06674a93a3dd31c45c90d59d P e481cdb11e4040f2b78a224a2bed5eaa reported cstate change: term changed from 0 to 1, leader changed from <none> to e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65). New cstate: current_term: 1 leader_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } health_report { overall_health: UNKNOWN } } }
I20260430 02:01:54.901631 25731 raft_consensus.cc:493] T 869a29991dbe4537bf1082d1c9ee2ecd P f5202ea2c8244e849a11073ee5d918c5 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260430 02:01:54.901798 25731 raft_consensus.cc:515] T 869a29991dbe4537bf1082d1c9ee2ecd P f5202ea2c8244e849a11073ee5d918c5 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.902364 25731 leader_election.cc:290] T 869a29991dbe4537bf1082d1c9ee2ecd P f5202ea2c8244e849a11073ee5d918c5 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971), a4ec6dffeb11435b8655672771cd29c4 (127.24.153.67:34223)
I20260430 02:01:54.902992 25632 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "869a29991dbe4537bf1082d1c9ee2ecd" candidate_uuid: "f5202ea2c8244e849a11073ee5d918c5" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "a4ec6dffeb11435b8655672771cd29c4" is_pre_election: true
I20260430 02:01:54.903000 25369 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "869a29991dbe4537bf1082d1c9ee2ecd" candidate_uuid: "f5202ea2c8244e849a11073ee5d918c5" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" is_pre_election: true
I20260430 02:01:54.903247 25369 raft_consensus.cc:2393] T 869a29991dbe4537bf1082d1c9ee2ecd P e481cdb11e4040f2b78a224a2bed5eaa [term 1 FOLLOWER]: Leader pre-election vote request: Denying vote to candidate f5202ea2c8244e849a11073ee5d918c5 in current term 1: Already voted for candidate a4ec6dffeb11435b8655672771cd29c4 in this term.
I20260430 02:01:54.903636 25437 leader_election.cc:304] T 869a29991dbe4537bf1082d1c9ee2ecd P f5202ea2c8244e849a11073ee5d918c5 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: f5202ea2c8244e849a11073ee5d918c5; no voters: a4ec6dffeb11435b8655672771cd29c4, e481cdb11e4040f2b78a224a2bed5eaa
I20260430 02:01:54.903887 25731 raft_consensus.cc:3060] T 869a29991dbe4537bf1082d1c9ee2ecd P f5202ea2c8244e849a11073ee5d918c5 [term 0 FOLLOWER]: Advancing to term 1
I20260430 02:01:54.904794 25731 raft_consensus.cc:2749] T 869a29991dbe4537bf1082d1c9ee2ecd P f5202ea2c8244e849a11073ee5d918c5 [term 1 FOLLOWER]: Leader pre-election lost for term 1. Reason: could not achieve majority
I20260430 02:01:54.911799 25705 raft_consensus.cc:493] T 29e24069ebd04300a2c4cf4d4bdc5e66 P e481cdb11e4040f2b78a224a2bed5eaa [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260430 02:01:54.911965 25705 raft_consensus.cc:515] T 29e24069ebd04300a2c4cf4d4bdc5e66 P e481cdb11e4040f2b78a224a2bed5eaa [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } }
I20260430 02:01:54.912417 25705 leader_election.cc:290] T 29e24069ebd04300a2c4cf4d4bdc5e66 P e481cdb11e4040f2b78a224a2bed5eaa [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers a4ec6dffeb11435b8655672771cd29c4 (127.24.153.67:34223), f5202ea2c8244e849a11073ee5d918c5 (127.24.153.66:43745)
I20260430 02:01:54.913051 25632 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "29e24069ebd04300a2c4cf4d4bdc5e66" candidate_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "a4ec6dffeb11435b8655672771cd29c4" is_pre_election: true
I20260430 02:01:54.913051 25500 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "29e24069ebd04300a2c4cf4d4bdc5e66" candidate_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "f5202ea2c8244e849a11073ee5d918c5" is_pre_election: true
I20260430 02:01:54.913216 25500 raft_consensus.cc:2468] T 29e24069ebd04300a2c4cf4d4bdc5e66 P f5202ea2c8244e849a11073ee5d918c5 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate e481cdb11e4040f2b78a224a2bed5eaa in term 0.
I20260430 02:01:54.913216 25632 raft_consensus.cc:2468] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate e481cdb11e4040f2b78a224a2bed5eaa in term 0.
I20260430 02:01:54.913566 25305 leader_election.cc:304] T 29e24069ebd04300a2c4cf4d4bdc5e66 P e481cdb11e4040f2b78a224a2bed5eaa [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: a4ec6dffeb11435b8655672771cd29c4, e481cdb11e4040f2b78a224a2bed5eaa; no voters:
I20260430 02:01:54.913868 25705 raft_consensus.cc:2804] T 29e24069ebd04300a2c4cf4d4bdc5e66 P e481cdb11e4040f2b78a224a2bed5eaa [term 0 FOLLOWER]: Leader pre-election won for term 1
I20260430 02:01:54.914003 25705 raft_consensus.cc:493] T 29e24069ebd04300a2c4cf4d4bdc5e66 P e481cdb11e4040f2b78a224a2bed5eaa [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260430 02:01:54.914064 25705 raft_consensus.cc:3060] T 29e24069ebd04300a2c4cf4d4bdc5e66 P e481cdb11e4040f2b78a224a2bed5eaa [term 0 FOLLOWER]: Advancing to term 1
I20260430 02:01:54.914927 25705 raft_consensus.cc:515] T 29e24069ebd04300a2c4cf4d4bdc5e66 P e481cdb11e4040f2b78a224a2bed5eaa [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } }
I20260430 02:01:54.915319 25705 leader_election.cc:290] T 29e24069ebd04300a2c4cf4d4bdc5e66 P e481cdb11e4040f2b78a224a2bed5eaa [CANDIDATE]: Term 1 election: Requested vote from peers a4ec6dffeb11435b8655672771cd29c4 (127.24.153.67:34223), f5202ea2c8244e849a11073ee5d918c5 (127.24.153.66:43745)
I20260430 02:01:54.916034 25632 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "29e24069ebd04300a2c4cf4d4bdc5e66" candidate_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "a4ec6dffeb11435b8655672771cd29c4"
I20260430 02:01:54.916190 25632 raft_consensus.cc:3060] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 [term 0 FOLLOWER]: Advancing to term 1
I20260430 02:01:54.916220 25500 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "29e24069ebd04300a2c4cf4d4bdc5e66" candidate_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "f5202ea2c8244e849a11073ee5d918c5"
I20260430 02:01:54.916353 25500 raft_consensus.cc:3060] T 29e24069ebd04300a2c4cf4d4bdc5e66 P f5202ea2c8244e849a11073ee5d918c5 [term 0 FOLLOWER]: Advancing to term 1
I20260430 02:01:54.917274 25632 raft_consensus.cc:2468] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate e481cdb11e4040f2b78a224a2bed5eaa in term 1.
I20260430 02:01:54.917326 25500 raft_consensus.cc:2468] T 29e24069ebd04300a2c4cf4d4bdc5e66 P f5202ea2c8244e849a11073ee5d918c5 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate e481cdb11e4040f2b78a224a2bed5eaa in term 1.
I20260430 02:01:54.917734 25305 leader_election.cc:304] T 29e24069ebd04300a2c4cf4d4bdc5e66 P e481cdb11e4040f2b78a224a2bed5eaa [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: a4ec6dffeb11435b8655672771cd29c4, e481cdb11e4040f2b78a224a2bed5eaa; no voters:
I20260430 02:01:54.918223 25705 raft_consensus.cc:2804] T 29e24069ebd04300a2c4cf4d4bdc5e66 P e481cdb11e4040f2b78a224a2bed5eaa [term 1 FOLLOWER]: Leader election won for term 1
I20260430 02:01:54.918342 25705 raft_consensus.cc:697] T 29e24069ebd04300a2c4cf4d4bdc5e66 P e481cdb11e4040f2b78a224a2bed5eaa [term 1 LEADER]: Becoming Leader. State: Replica: e481cdb11e4040f2b78a224a2bed5eaa, State: Running, Role: LEADER
I20260430 02:01:54.918502 25705 consensus_queue.cc:237] T 29e24069ebd04300a2c4cf4d4bdc5e66 P e481cdb11e4040f2b78a224a2bed5eaa [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } }
I20260430 02:01:54.920477 25225 catalog_manager.cc:5671] T 29e24069ebd04300a2c4cf4d4bdc5e66 P e481cdb11e4040f2b78a224a2bed5eaa reported cstate change: term changed from 0 to 1, leader changed from <none> to e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65). New cstate: current_term: 1 leader_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } health_report { overall_health: UNKNOWN } } }
I20260430 02:01:54.937848 25724 raft_consensus.cc:493] T fbfe4b6b54594e36a76de6e54d2adb8c P f5202ea2c8244e849a11073ee5d918c5 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260430 02:01:54.938087 25724 raft_consensus.cc:515] T fbfe4b6b54594e36a76de6e54d2adb8c P f5202ea2c8244e849a11073ee5d918c5 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:54.938692 25724 leader_election.cc:290] T fbfe4b6b54594e36a76de6e54d2adb8c P f5202ea2c8244e849a11073ee5d918c5 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971), a4ec6dffeb11435b8655672771cd29c4 (127.24.153.67:34223)
I20260430 02:01:54.938984 25369 raft_consensus.cc:3060] T 63c4457448ec4b1b8a0741a7560cddfe P e481cdb11e4040f2b78a224a2bed5eaa [term 0 FOLLOWER]: Advancing to term 1
I20260430 02:01:54.939126 25370 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "fbfe4b6b54594e36a76de6e54d2adb8c" candidate_uuid: "f5202ea2c8244e849a11073ee5d918c5" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" is_pre_election: true
I20260430 02:01:54.939190 25632 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "fbfe4b6b54594e36a76de6e54d2adb8c" candidate_uuid: "f5202ea2c8244e849a11073ee5d918c5" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "a4ec6dffeb11435b8655672771cd29c4" is_pre_election: true
I20260430 02:01:54.939384 25632 raft_consensus.cc:2393] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 [term 1 FOLLOWER]: Leader pre-election vote request: Denying vote to candidate f5202ea2c8244e849a11073ee5d918c5 in current term 1: Already voted for candidate e481cdb11e4040f2b78a224a2bed5eaa in this term.
I20260430 02:01:54.939679 25436 leader_election.cc:304] T fbfe4b6b54594e36a76de6e54d2adb8c P f5202ea2c8244e849a11073ee5d918c5 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: f5202ea2c8244e849a11073ee5d918c5; no voters: a4ec6dffeb11435b8655672771cd29c4, e481cdb11e4040f2b78a224a2bed5eaa
I20260430 02:01:54.939926 25724 raft_consensus.cc:3060] T fbfe4b6b54594e36a76de6e54d2adb8c P f5202ea2c8244e849a11073ee5d918c5 [term 0 FOLLOWER]: Advancing to term 1
I20260430 02:01:54.940847 25724 raft_consensus.cc:2749] T fbfe4b6b54594e36a76de6e54d2adb8c P f5202ea2c8244e849a11073ee5d918c5 [term 1 FOLLOWER]: Leader pre-election lost for term 1. Reason: could not achieve majority
I20260430 02:01:54.940817 25731 consensus_queue.cc:1048] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 [LEADER]: Connected to new peer: Peer: permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260430 02:01:54.946530 25724 consensus_queue.cc:1048] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 [LEADER]: Connected to new peer: Peer: permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260430 02:01:54.951960 25733 consensus_queue.cc:1048] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 [LEADER]: Connected to new peer: Peer: permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
W20260430 02:01:54.953477 25679 tablet.cc:2404] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4: Can't schedule compaction. Clean time has not been advanced past its initial value.
W20260430 02:01:54.954100 25679 tablet_replica_mm_ops.cc:318] Log GC is disabled (check --enable_log_gc)
I20260430 02:01:54.957554 25189 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
/tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.24.153.68:0
--local_ip_for_outbound_sockets=127.24.153.68
--webserver_interface=127.24.153.68
--webserver_port=0
--tserver_master_addrs=127.24.153.126:40723
--builtin_ntp_servers=127.24.153.84:39365
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--follower_unavailable_considered_failed_sec=2
--enable_log_gc=false with env {}
I20260430 02:01:54.958174 25733 consensus_queue.cc:1048] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 [LEADER]: Connected to new peer: Peer: permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260430 02:01:54.964432 25733 consensus_queue.cc:1048] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 [LEADER]: Connected to new peer: Peer: permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260430 02:01:54.967350 25733 consensus_queue.cc:1048] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 [LEADER]: Connected to new peer: Peer: permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260430 02:01:54.979355 25705 consensus_queue.cc:1048] T f1dbbbee06674a93a3dd31c45c90d59d P e481cdb11e4040f2b78a224a2bed5eaa [LEADER]: Connected to new peer: Peer: permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260430 02:01:54.980973 25705 consensus_queue.cc:1048] T fbfe4b6b54594e36a76de6e54d2adb8c P e481cdb11e4040f2b78a224a2bed5eaa [LEADER]: Connected to new peer: Peer: permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260430 02:01:54.982791 25705 consensus_queue.cc:1048] T fbfe4b6b54594e36a76de6e54d2adb8c P e481cdb11e4040f2b78a224a2bed5eaa [LEADER]: Connected to new peer: Peer: permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
W20260430 02:01:54.985692 25548 tablet.cc:2404] T 29e24069ebd04300a2c4cf4d4bdc5e66 P f5202ea2c8244e849a11073ee5d918c5: Can't schedule compaction. Clean time has not been advanced past its initial value.
W20260430 02:01:54.986274 25548 tablet_replica_mm_ops.cc:318] Log GC is disabled (check --enable_log_gc)
I20260430 02:01:54.987183 25707 consensus_queue.cc:1048] T f1dbbbee06674a93a3dd31c45c90d59d P e481cdb11e4040f2b78a224a2bed5eaa [LEADER]: Connected to new peer: Peer: permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260430 02:01:55.014659 25632 raft_consensus.cc:1275] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 [term 1 FOLLOWER]: Refusing update from remote peer e481cdb11e4040f2b78a224a2bed5eaa: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260430 02:01:55.018471 25500 raft_consensus.cc:1275] T 29e24069ebd04300a2c4cf4d4bdc5e66 P f5202ea2c8244e849a11073ee5d918c5 [term 1 FOLLOWER]: Refusing update from remote peer e481cdb11e4040f2b78a224a2bed5eaa: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260430 02:01:55.019518 25705 consensus_queue.cc:1048] T 29e24069ebd04300a2c4cf4d4bdc5e66 P e481cdb11e4040f2b78a224a2bed5eaa [LEADER]: Connected to new peer: Peer: permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260430 02:01:55.029978 25707 consensus_queue.cc:1048] T 29e24069ebd04300a2c4cf4d4bdc5e66 P e481cdb11e4040f2b78a224a2bed5eaa [LEADER]: Connected to new peer: Peer: permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260430 02:01:55.033898 25764 mvcc.cc:204] Tried to move back new op lower bound from 7280699453514461184 to 7280699453108031488. Current Snapshot: MvccSnapshot[applied={T|T < 7280699453484171264}]
I20260430 02:01:55.040555 25746 mvcc.cc:204] Tried to move back new op lower bound from 7280699453514461184 to 7280699453108031488. Current Snapshot: MvccSnapshot[applied={T|T < 7280699453484171264}]
W20260430 02:01:55.045305 25417 tablet_replica_mm_ops.cc:318] Log GC is disabled (check --enable_log_gc)
I20260430 02:01:55.051913 25743 mvcc.cc:204] Tried to move back new op lower bound from 7280699453514461184 to 7280699453108031488. Current Snapshot: MvccSnapshot[applied={T|T < 7280699453484171264}]
W20260430 02:01:55.219827 25752 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260430 02:01:55.220288 25752 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260430 02:01:55.220394 25752 flags.cc:432] Enabled unsafe flag: --enable_log_gc=false
W20260430 02:01:55.220489 25752 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260430 02:01:55.235558 25752 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260430 02:01:55.245687 25752 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.24.153.68
I20260430 02:01:55.252286 25752 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.24.153.84:39365
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--follower_unavailable_considered_failed_sec=2
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.24.153.68:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.24.153.68
--webserver_port=0
--enable_log_gc=false
--tserver_master_addrs=127.24.153.126:40723
--never_fsync=true
--heap_profile_path=/tmp/kudu.25752
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.24.153.68
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8537ed5bb5e0e6ac1e481ff961e6b9d08cce7671
build type DEBUG
built by None at 30 Apr 2026 01:43:13 UTC on bdcb31816ec0
build id 11668
I20260430 02:01:55.253878 25752 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260430 02:01:55.255523 25752 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260430 02:01:55.262228 25752 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260430 02:01:55.348270 25806 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:01:55.348753 25804 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260430 02:01:55.350329 25752 server_base.cc:1061] running on GCE node
W20260430 02:01:55.362982 25803 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260430 02:01:55.365123 25752 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260430 02:01:55.365993 25752 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260430 02:01:55.375905 25752 hybrid_clock.cc:648] HybridClock initialized: now 1777514515375716 us; error 190 us; skew 500 ppm
I20260430 02:01:55.379387 25752 webserver.cc:492] Webserver started at http://127.24.153.68:39511/ using document root <none> and password file <none>
I20260430 02:01:55.380298 25752 fs_manager.cc:362] Metadata directory not provided
I20260430 02:01:55.380704 25752 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260430 02:01:55.386363 25752 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260430 02:01:55.389117 25752 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-3/data/instance:
uuid: "7f82aeb1ab544f90b83142774323b0a3"
format_stamp: "Formatted at 2026-04-30 02:01:55 on dist-test-slave-f7mg"
I20260430 02:01:55.389804 25752 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-3/wal/instance:
uuid: "7f82aeb1ab544f90b83142774323b0a3"
format_stamp: "Formatted at 2026-04-30 02:01:55 on dist-test-slave-f7mg"
I20260430 02:01:55.406203 25752 fs_manager.cc:696] Time spent creating directory manager: real 0.016s user 0.004s sys 0.000s
I20260430 02:01:55.497644 25812 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260430 02:01:55.499382 25752 fs_manager.cc:730] Time spent opening block manager: real 0.068s user 0.002s sys 0.001s
I20260430 02:01:55.499545 25752 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-3/data,/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-3/wal
uuid: "7f82aeb1ab544f90b83142774323b0a3"
format_stamp: "Formatted at 2026-04-30 02:01:55 on dist-test-slave-f7mg"
I20260430 02:01:55.499675 25752 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260430 02:01:55.551909 25752 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260430 02:01:55.552733 25752 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260430 02:01:55.552915 25752 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260430 02:01:55.553587 25752 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260430 02:01:55.564139 25752 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260430 02:01:55.564231 25752 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20260430 02:01:55.564371 25752 ts_tablet_manager.cc:616] Registered 0 tablets
I20260430 02:01:55.564419 25752 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20260430 02:01:55.579798 25752 rpc_server.cc:307] RPC server started. Bound to: 127.24.153.68:45771
I20260430 02:01:55.581308 25752 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-3/data/info.pb
I20260430 02:01:55.591636 25189 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu as pid 25752
I20260430 02:01:55.591738 25189 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-3/wal/instance
I20260430 02:01:55.596108 25925 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.24.153.68:45771 every 8 connection(s)
I20260430 02:01:55.613862 25926 heartbeater.cc:344] Connected to a master server at 127.24.153.126:40723
I20260430 02:01:55.614214 25926 heartbeater.cc:461] Registering TS with master...
I20260430 02:01:55.615442 25926 heartbeater.cc:507] Master 127.24.153.126:40723 requested a full tablet report, sending...
I20260430 02:01:55.616916 25222 ts_manager.cc:194] Registered new tserver with Master: 7f82aeb1ab544f90b83142774323b0a3 (127.24.153.68:45771)
I20260430 02:01:55.617664 25222 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.24.153.68:57097
I20260430 02:01:55.950749 25222 ts_manager.cc:295] Set tserver state for e481cdb11e4040f2b78a224a2bed5eaa to MAINTENANCE_MODE
I20260430 02:01:55.951442 25189 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu with pid 25288
W20260430 02:01:55.964455 25437 connection.cc:570] client connection to 127.24.153.65:44971 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20260430 02:01:55.964653 25437 proxy.cc:239] Call had error, refreshing address and retrying: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20260430 02:01:55.964923 25568 connection.cc:570] client connection to 127.24.153.65:44971 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20260430 02:01:55.965049 25568 proxy.cc:239] Call had error, refreshing address and retrying: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20260430 02:01:55.965392 25568 connection.cc:570] server connection from 127.24.153.65:56837 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20260430 02:01:55.965744 25689 connection.cc:570] client connection to 127.24.153.65:44971 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20260430 02:01:55.965883 25689 meta_cache.cc:302] tablet fbfe4b6b54594e36a76de6e54d2adb8c: replica e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971) has failed: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20260430 02:01:55.966969 25437 consensus_peers.cc:597] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20260430 02:01:55.966995 25568 consensus_peers.cc:597] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20260430 02:01:55.967096 25568 consensus_peers.cc:597] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20260430 02:01:55.968245 25459 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:55.968246 25587 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:55.968246 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:55.970836 25459 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:55.970841 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:55.970930 25587 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:55.971450 25461 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:55.972118 25587 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:55.972141 25592 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:55.972419 25461 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:55.972452 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:55.974529 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:55.974606 25461 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:55.975663 25592 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:55.975813 25587 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:55.976209 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:55.977042 25587 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:55.978412 25587 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:55.979516 25587 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:55.979681 25592 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:55.988296 25592 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:55.989804 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:55.996662 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:55.996655 25461 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:55.999032 25592 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:55.999075 25587 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.001694 25461 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.004036 25461 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.005333 25587 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.005344 25461 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.011193 25461 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.011508 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.012434 25587 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.013833 25587 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.013833 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.016057 25587 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.017270 25587 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.027349 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.027433 25461 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.030936 25461 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.031630 25461 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.032011 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.034159 25592 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.036496 25592 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.038296 25587 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.041719 25587 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.041719 25592 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.045475 25592 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.045693 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.054481 25592 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.056231 25592 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.058234 25592 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.058374 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.066629 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.067797 25461 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.067798 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.068379 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.074331 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.077648 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.079871 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.081229 25592 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.081439 25587 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.085839 25587 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.087034 25587 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.091394 25587 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.091400 25592 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.096772 25592 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.098997 25592 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.099098 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.100301 25592 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.102751 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.115684 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.115727 25461 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.116458 25592 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.116737 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.120122 25592 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.129016 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.134750 25592 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.137059 25592 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.145282 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.145344 25592 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.156132 25592 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.156183 25587 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.156744 25461 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.156816 25587 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.156839 25459 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.157434 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.160806 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.166137 25592 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.176054 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.177340 25587 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.177788 25592 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.178069 25587 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.178447 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.198643 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.200670 25587 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.208330 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.209555 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.220821 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.221228 25587 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.224432 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
I20260430 02:01:56.226150 25795 raft_consensus.cc:493] T fbfe4b6b54594e36a76de6e54d2adb8c P f5202ea2c8244e849a11073ee5d918c5 [term 1 FOLLOWER]: Starting pre-election (detected failure of leader e481cdb11e4040f2b78a224a2bed5eaa)
I20260430 02:01:56.226287 25795 raft_consensus.cc:515] T fbfe4b6b54594e36a76de6e54d2adb8c P f5202ea2c8244e849a11073ee5d918c5 [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:56.226764 25770 raft_consensus.cc:493] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 [term 1 FOLLOWER]: Starting pre-election (detected failure of leader e481cdb11e4040f2b78a224a2bed5eaa)
I20260430 02:01:56.226850 25795 leader_election.cc:290] T fbfe4b6b54594e36a76de6e54d2adb8c P f5202ea2c8244e849a11073ee5d918c5 [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971), a4ec6dffeb11435b8655672771cd29c4 (127.24.153.67:34223)
I20260430 02:01:56.226869 25770 raft_consensus.cc:515] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:56.227218 25770 leader_election.cc:290] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971), f5202ea2c8244e849a11073ee5d918c5 (127.24.153.66:43745)
I20260430 02:01:56.227367 25632 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "fbfe4b6b54594e36a76de6e54d2adb8c" candidate_uuid: "f5202ea2c8244e849a11073ee5d918c5" candidate_term: 2 candidate_status { last_received { term: 1 index: 68 } } ignore_live_leader: false dest_uuid: "a4ec6dffeb11435b8655672771cd29c4" is_pre_election: true
I20260430 02:01:56.227471 25632 raft_consensus.cc:2468] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 [term 1 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate f5202ea2c8244e849a11073ee5d918c5 in term 1.
I20260430 02:01:56.227646 25501 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "fbfe4b6b54594e36a76de6e54d2adb8c" candidate_uuid: "a4ec6dffeb11435b8655672771cd29c4" candidate_term: 2 candidate_status { last_received { term: 1 index: 68 } } ignore_live_leader: false dest_uuid: "f5202ea2c8244e849a11073ee5d918c5" is_pre_election: true
I20260430 02:01:56.227783 25501 raft_consensus.cc:2468] T fbfe4b6b54594e36a76de6e54d2adb8c P f5202ea2c8244e849a11073ee5d918c5 [term 1 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate a4ec6dffeb11435b8655672771cd29c4 in term 1.
W20260430 02:01:56.227994 25437 leader_election.cc:336] T fbfe4b6b54594e36a76de6e54d2adb8c P f5202ea2c8244e849a11073ee5d918c5 [CANDIDATE]: Term 2 pre-election: RPC error from VoteRequest() call to peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111)
I20260430 02:01:56.228243 25568 leader_election.cc:304] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: a4ec6dffeb11435b8655672771cd29c4, f5202ea2c8244e849a11073ee5d918c5; no voters:
I20260430 02:01:56.228281 25436 leader_election.cc:304] T fbfe4b6b54594e36a76de6e54d2adb8c P f5202ea2c8244e849a11073ee5d918c5 [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: a4ec6dffeb11435b8655672771cd29c4, f5202ea2c8244e849a11073ee5d918c5; no voters: e481cdb11e4040f2b78a224a2bed5eaa
W20260430 02:01:56.228392 25568 leader_election.cc:336] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 [CANDIDATE]: Term 2 pre-election: RPC error from VoteRequest() call to peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111)
I20260430 02:01:56.228433 25770 raft_consensus.cc:2804] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 [term 1 FOLLOWER]: Leader pre-election won for term 2
I20260430 02:01:56.228484 25770 raft_consensus.cc:493] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 [term 1 FOLLOWER]: Starting leader election (detected failure of leader e481cdb11e4040f2b78a224a2bed5eaa)
I20260430 02:01:56.228525 25770 raft_consensus.cc:3060] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 [term 1 FOLLOWER]: Advancing to term 2
I20260430 02:01:56.229332 25770 raft_consensus.cc:515] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 [term 2 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:56.229597 25770 leader_election.cc:290] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 [CANDIDATE]: Term 2 election: Requested vote from peers e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971), f5202ea2c8244e849a11073ee5d918c5 (127.24.153.66:43745)
I20260430 02:01:56.229743 25795 raft_consensus.cc:2804] T fbfe4b6b54594e36a76de6e54d2adb8c P f5202ea2c8244e849a11073ee5d918c5 [term 1 FOLLOWER]: Leader pre-election won for term 2
I20260430 02:01:56.229830 25795 raft_consensus.cc:493] T fbfe4b6b54594e36a76de6e54d2adb8c P f5202ea2c8244e849a11073ee5d918c5 [term 1 FOLLOWER]: Starting leader election (detected failure of leader e481cdb11e4040f2b78a224a2bed5eaa)
I20260430 02:01:56.229904 25795 raft_consensus.cc:3060] T fbfe4b6b54594e36a76de6e54d2adb8c P f5202ea2c8244e849a11073ee5d918c5 [term 1 FOLLOWER]: Advancing to term 2
I20260430 02:01:56.230389 25501 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "fbfe4b6b54594e36a76de6e54d2adb8c" candidate_uuid: "a4ec6dffeb11435b8655672771cd29c4" candidate_term: 2 candidate_status { last_received { term: 1 index: 68 } } ignore_live_leader: false dest_uuid: "f5202ea2c8244e849a11073ee5d918c5"
W20260430 02:01:56.230917 25568 leader_election.cc:336] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 [CANDIDATE]: Term 2 election: RPC error from VoteRequest() call to peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111)
I20260430 02:01:56.231475 25795 raft_consensus.cc:515] T fbfe4b6b54594e36a76de6e54d2adb8c P f5202ea2c8244e849a11073ee5d918c5 [term 2 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:56.231838 25501 raft_consensus.cc:2393] T fbfe4b6b54594e36a76de6e54d2adb8c P f5202ea2c8244e849a11073ee5d918c5 [term 2 FOLLOWER]: Leader election vote request: Denying vote to candidate a4ec6dffeb11435b8655672771cd29c4 in current term 2: Already voted for candidate f5202ea2c8244e849a11073ee5d918c5 in this term.
I20260430 02:01:56.231997 25795 leader_election.cc:290] T fbfe4b6b54594e36a76de6e54d2adb8c P f5202ea2c8244e849a11073ee5d918c5 [CANDIDATE]: Term 2 election: Requested vote from peers e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971), a4ec6dffeb11435b8655672771cd29c4 (127.24.153.67:34223)
I20260430 02:01:56.232409 25568 leader_election.cc:304] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 [CANDIDATE]: Term 2 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: a4ec6dffeb11435b8655672771cd29c4; no voters: e481cdb11e4040f2b78a224a2bed5eaa, f5202ea2c8244e849a11073ee5d918c5
I20260430 02:01:56.232759 25770 raft_consensus.cc:2749] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 [term 2 FOLLOWER]: Leader election lost for term 2. Reason: could not achieve majority
I20260430 02:01:56.232896 25801 raft_consensus.cc:493] T f1dbbbee06674a93a3dd31c45c90d59d P a4ec6dffeb11435b8655672771cd29c4 [term 1 FOLLOWER]: Starting pre-election (detected failure of leader e481cdb11e4040f2b78a224a2bed5eaa)
I20260430 02:01:56.232980 25801 raft_consensus.cc:515] T f1dbbbee06674a93a3dd31c45c90d59d P a4ec6dffeb11435b8655672771cd29c4 [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:56.233227 25632 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "fbfe4b6b54594e36a76de6e54d2adb8c" candidate_uuid: "f5202ea2c8244e849a11073ee5d918c5" candidate_term: 2 candidate_status { last_received { term: 1 index: 68 } } ignore_live_leader: false dest_uuid: "a4ec6dffeb11435b8655672771cd29c4"
W20260430 02:01:56.233302 25437 leader_election.cc:336] T fbfe4b6b54594e36a76de6e54d2adb8c P f5202ea2c8244e849a11073ee5d918c5 [CANDIDATE]: Term 2 election: RPC error from VoteRequest() call to peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111)
I20260430 02:01:56.233343 25632 raft_consensus.cc:2393] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 [term 2 FOLLOWER]: Leader election vote request: Denying vote to candidate f5202ea2c8244e849a11073ee5d918c5 in current term 2: Already voted for candidate a4ec6dffeb11435b8655672771cd29c4 in this term.
I20260430 02:01:56.233385 25801 leader_election.cc:290] T f1dbbbee06674a93a3dd31c45c90d59d P a4ec6dffeb11435b8655672771cd29c4 [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971), f5202ea2c8244e849a11073ee5d918c5 (127.24.153.66:43745)
W20260430 02:01:56.233482 25587 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
I20260430 02:01:56.233706 25436 leader_election.cc:304] T fbfe4b6b54594e36a76de6e54d2adb8c P f5202ea2c8244e849a11073ee5d918c5 [CANDIDATE]: Term 2 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: f5202ea2c8244e849a11073ee5d918c5; no voters: a4ec6dffeb11435b8655672771cd29c4, e481cdb11e4040f2b78a224a2bed5eaa
I20260430 02:01:56.234032 25795 raft_consensus.cc:2749] T fbfe4b6b54594e36a76de6e54d2adb8c P f5202ea2c8244e849a11073ee5d918c5 [term 2 FOLLOWER]: Leader election lost for term 2. Reason: could not achieve majority
I20260430 02:01:56.234210 25501 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "f1dbbbee06674a93a3dd31c45c90d59d" candidate_uuid: "a4ec6dffeb11435b8655672771cd29c4" candidate_term: 2 candidate_status { last_received { term: 1 index: 68 } } ignore_live_leader: false dest_uuid: "f5202ea2c8244e849a11073ee5d918c5" is_pre_election: true
W20260430 02:01:56.234730 25568 leader_election.cc:336] T f1dbbbee06674a93a3dd31c45c90d59d P a4ec6dffeb11435b8655672771cd29c4 [CANDIDATE]: Term 2 pre-election: RPC error from VoteRequest() call to peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111)
I20260430 02:01:56.234792 25568 leader_election.cc:304] T f1dbbbee06674a93a3dd31c45c90d59d P a4ec6dffeb11435b8655672771cd29c4 [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: a4ec6dffeb11435b8655672771cd29c4; no voters: e481cdb11e4040f2b78a224a2bed5eaa, f5202ea2c8244e849a11073ee5d918c5
I20260430 02:01:56.234966 25801 raft_consensus.cc:2749] T f1dbbbee06674a93a3dd31c45c90d59d P a4ec6dffeb11435b8655672771cd29c4 [term 1 FOLLOWER]: Leader pre-election lost for term 2. Reason: could not achieve majority
I20260430 02:01:56.235785 25795 raft_consensus.cc:493] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 [term 1 FOLLOWER]: Starting pre-election (detected failure of leader e481cdb11e4040f2b78a224a2bed5eaa)
I20260430 02:01:56.235873 25795 raft_consensus.cc:515] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:56.236271 25795 leader_election.cc:290] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971), a4ec6dffeb11435b8655672771cd29c4 (127.24.153.67:34223)
I20260430 02:01:56.236720 25632 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "f1dbbbee06674a93a3dd31c45c90d59d" candidate_uuid: "f5202ea2c8244e849a11073ee5d918c5" candidate_term: 2 candidate_status { last_received { term: 1 index: 68 } } ignore_live_leader: false dest_uuid: "a4ec6dffeb11435b8655672771cd29c4" is_pre_election: true
I20260430 02:01:56.236876 25632 raft_consensus.cc:2468] T f1dbbbee06674a93a3dd31c45c90d59d P a4ec6dffeb11435b8655672771cd29c4 [term 1 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate f5202ea2c8244e849a11073ee5d918c5 in term 1.
I20260430 02:01:56.237177 25436 leader_election.cc:304] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: a4ec6dffeb11435b8655672771cd29c4, f5202ea2c8244e849a11073ee5d918c5; no voters:
W20260430 02:01:56.237289 25437 leader_election.cc:336] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 [CANDIDATE]: Term 2 pre-election: RPC error from VoteRequest() call to peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111)
I20260430 02:01:56.237365 25795 raft_consensus.cc:2804] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 [term 1 FOLLOWER]: Leader pre-election won for term 2
I20260430 02:01:56.237416 25795 raft_consensus.cc:493] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 [term 1 FOLLOWER]: Starting leader election (detected failure of leader e481cdb11e4040f2b78a224a2bed5eaa)
I20260430 02:01:56.237461 25795 raft_consensus.cc:3060] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 [term 1 FOLLOWER]: Advancing to term 2
I20260430 02:01:56.238344 25795 raft_consensus.cc:515] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 [term 2 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:56.238688 25795 leader_election.cc:290] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 [CANDIDATE]: Term 2 election: Requested vote from peers e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971), a4ec6dffeb11435b8655672771cd29c4 (127.24.153.67:34223)
I20260430 02:01:56.239106 25632 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "f1dbbbee06674a93a3dd31c45c90d59d" candidate_uuid: "f5202ea2c8244e849a11073ee5d918c5" candidate_term: 2 candidate_status { last_received { term: 1 index: 68 } } ignore_live_leader: false dest_uuid: "a4ec6dffeb11435b8655672771cd29c4"
I20260430 02:01:56.239244 25632 raft_consensus.cc:3060] T f1dbbbee06674a93a3dd31c45c90d59d P a4ec6dffeb11435b8655672771cd29c4 [term 1 FOLLOWER]: Advancing to term 2
W20260430 02:01:56.239554 25437 leader_election.cc:336] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 [CANDIDATE]: Term 2 election: RPC error from VoteRequest() call to peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111)
I20260430 02:01:56.240360 25632 raft_consensus.cc:2468] T f1dbbbee06674a93a3dd31c45c90d59d P a4ec6dffeb11435b8655672771cd29c4 [term 2 FOLLOWER]: Leader election vote request: Granting yes vote for candidate f5202ea2c8244e849a11073ee5d918c5 in term 2.
I20260430 02:01:56.240784 25436 leader_election.cc:304] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 [CANDIDATE]: Term 2 election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: a4ec6dffeb11435b8655672771cd29c4, f5202ea2c8244e849a11073ee5d918c5; no voters: e481cdb11e4040f2b78a224a2bed5eaa
I20260430 02:01:56.240953 25795 raft_consensus.cc:2804] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 [term 2 FOLLOWER]: Leader election won for term 2
I20260430 02:01:56.241042 25795 raft_consensus.cc:697] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 [term 2 LEADER]: Becoming Leader. State: Replica: f5202ea2c8244e849a11073ee5d918c5, State: Running, Role: LEADER
I20260430 02:01:56.241187 25795 consensus_queue.cc:237] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 68, Committed index: 68, Last appended: 1.68, Last appended by leader: 68, Current term: 2, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:56.242878 25224 catalog_manager.cc:5671] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 reported cstate change: term changed from 1 to 2, leader changed from e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65) to f5202ea2c8244e849a11073ee5d918c5 (127.24.153.66). New cstate: current_term: 2 leader_uuid: "f5202ea2c8244e849a11073ee5d918c5" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } health_report { overall_health: UNKNOWN } } }
W20260430 02:01:56.244344 25587 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
I20260430 02:01:56.244763 25630 raft_consensus.cc:1275] T f1dbbbee06674a93a3dd31c45c90d59d P a4ec6dffeb11435b8655672771cd29c4 [term 2 FOLLOWER]: Refusing update from remote peer f5202ea2c8244e849a11073ee5d918c5: Log matching property violated. Preceding OpId in replica: term: 1 index: 68. Preceding OpId from leader: term: 2 index: 70. (index mismatch)
I20260430 02:01:56.245363 25731 consensus_queue.cc:1048] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 [LEADER]: Connected to new peer: Peer: permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 69, Last known committed idx: 68, Time since last communication: 0.000s
W20260430 02:01:56.245483 25437 consensus_peers.cc:597] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20260430 02:01:56.247819 25587 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.249276 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
I20260430 02:01:56.249743 25796 raft_consensus.cc:493] T 29e24069ebd04300a2c4cf4d4bdc5e66 P f5202ea2c8244e849a11073ee5d918c5 [term 1 FOLLOWER]: Starting pre-election (detected failure of leader e481cdb11e4040f2b78a224a2bed5eaa)
I20260430 02:01:56.249902 25796 raft_consensus.cc:515] T 29e24069ebd04300a2c4cf4d4bdc5e66 P f5202ea2c8244e849a11073ee5d918c5 [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } }
I20260430 02:01:56.250344 25796 leader_election.cc:290] T 29e24069ebd04300a2c4cf4d4bdc5e66 P f5202ea2c8244e849a11073ee5d918c5 [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971), a4ec6dffeb11435b8655672771cd29c4 (127.24.153.67:34223)
I20260430 02:01:56.250667 25630 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "29e24069ebd04300a2c4cf4d4bdc5e66" candidate_uuid: "f5202ea2c8244e849a11073ee5d918c5" candidate_term: 2 candidate_status { last_received { term: 1 index: 70 } } ignore_live_leader: false dest_uuid: "a4ec6dffeb11435b8655672771cd29c4" is_pre_election: true
W20260430 02:01:56.251945 25437 leader_election.cc:336] T 29e24069ebd04300a2c4cf4d4bdc5e66 P f5202ea2c8244e849a11073ee5d918c5 [CANDIDATE]: Term 2 pre-election: RPC error from VoteRequest() call to peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111)
I20260430 02:01:56.252048 25437 leader_election.cc:304] T 29e24069ebd04300a2c4cf4d4bdc5e66 P f5202ea2c8244e849a11073ee5d918c5 [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: f5202ea2c8244e849a11073ee5d918c5; no voters: a4ec6dffeb11435b8655672771cd29c4, e481cdb11e4040f2b78a224a2bed5eaa
I20260430 02:01:56.252200 25801 raft_consensus.cc:493] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 [term 1 FOLLOWER]: Starting pre-election (detected failure of leader e481cdb11e4040f2b78a224a2bed5eaa)
I20260430 02:01:56.252274 25795 raft_consensus.cc:2749] T 29e24069ebd04300a2c4cf4d4bdc5e66 P f5202ea2c8244e849a11073ee5d918c5 [term 1 FOLLOWER]: Leader pre-election lost for term 2. Reason: could not achieve majority
I20260430 02:01:56.252290 25801 raft_consensus.cc:515] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } }
I20260430 02:01:56.252668 25801 leader_election.cc:290] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971), f5202ea2c8244e849a11073ee5d918c5 (127.24.153.66:43745)
W20260430 02:01:56.253875 25568 leader_election.cc:336] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 [CANDIDATE]: Term 2 pre-election: RPC error from VoteRequest() call to peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111)
W20260430 02:01:56.254348 25587 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
I20260430 02:01:56.255049 25501 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "29e24069ebd04300a2c4cf4d4bdc5e66" candidate_uuid: "a4ec6dffeb11435b8655672771cd29c4" candidate_term: 2 candidate_status { last_received { term: 1 index: 70 } } ignore_live_leader: false dest_uuid: "f5202ea2c8244e849a11073ee5d918c5" is_pre_election: true
W20260430 02:01:56.255124 25587 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
I20260430 02:01:56.255180 25501 raft_consensus.cc:2468] T 29e24069ebd04300a2c4cf4d4bdc5e66 P f5202ea2c8244e849a11073ee5d918c5 [term 1 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate a4ec6dffeb11435b8655672771cd29c4 in term 1.
I20260430 02:01:56.255472 25568 leader_election.cc:304] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: a4ec6dffeb11435b8655672771cd29c4, f5202ea2c8244e849a11073ee5d918c5; no voters: e481cdb11e4040f2b78a224a2bed5eaa
I20260430 02:01:56.255638 25801 raft_consensus.cc:2804] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 [term 1 FOLLOWER]: Leader pre-election won for term 2
I20260430 02:01:56.255705 25801 raft_consensus.cc:493] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 [term 1 FOLLOWER]: Starting leader election (detected failure of leader e481cdb11e4040f2b78a224a2bed5eaa)
I20260430 02:01:56.255735 25801 raft_consensus.cc:3060] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 [term 1 FOLLOWER]: Advancing to term 2
I20260430 02:01:56.257047 25801 raft_consensus.cc:515] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 [term 2 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } }
I20260430 02:01:56.257409 25801 leader_election.cc:290] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 [CANDIDATE]: Term 2 election: Requested vote from peers e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971), f5202ea2c8244e849a11073ee5d918c5 (127.24.153.66:43745)
I20260430 02:01:56.258107 25501 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "29e24069ebd04300a2c4cf4d4bdc5e66" candidate_uuid: "a4ec6dffeb11435b8655672771cd29c4" candidate_term: 2 candidate_status { last_received { term: 1 index: 70 } } ignore_live_leader: false dest_uuid: "f5202ea2c8244e849a11073ee5d918c5"
I20260430 02:01:56.258234 25501 raft_consensus.cc:3060] T 29e24069ebd04300a2c4cf4d4bdc5e66 P f5202ea2c8244e849a11073ee5d918c5 [term 1 FOLLOWER]: Advancing to term 2
W20260430 02:01:56.258558 25568 leader_election.cc:336] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 [CANDIDATE]: Term 2 election: RPC error from VoteRequest() call to peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111)
I20260430 02:01:56.259199 25501 raft_consensus.cc:2468] T 29e24069ebd04300a2c4cf4d4bdc5e66 P f5202ea2c8244e849a11073ee5d918c5 [term 2 FOLLOWER]: Leader election vote request: Granting yes vote for candidate a4ec6dffeb11435b8655672771cd29c4 in term 2.
I20260430 02:01:56.259568 25568 leader_election.cc:304] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 [CANDIDATE]: Term 2 election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: a4ec6dffeb11435b8655672771cd29c4, f5202ea2c8244e849a11073ee5d918c5; no voters: e481cdb11e4040f2b78a224a2bed5eaa
I20260430 02:01:56.259771 25801 raft_consensus.cc:2804] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 [term 2 FOLLOWER]: Leader election won for term 2
I20260430 02:01:56.259918 25801 raft_consensus.cc:697] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 [term 2 LEADER]: Becoming Leader. State: Replica: a4ec6dffeb11435b8655672771cd29c4, State: Running, Role: LEADER
I20260430 02:01:56.260073 25801 consensus_queue.cc:237] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 69, Committed index: 69, Last appended: 1.70, Last appended by leader: 70, Current term: 2, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } }
I20260430 02:01:56.261771 25222 catalog_manager.cc:5671] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 reported cstate change: term changed from 1 to 2, leader changed from e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65) to a4ec6dffeb11435b8655672771cd29c4 (127.24.153.67). New cstate: current_term: 2 leader_uuid: "a4ec6dffeb11435b8655672771cd29c4" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } health_report { overall_health: UNKNOWN } } }
W20260430 02:01:56.270694 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
I20260430 02:01:56.279153 25501 raft_consensus.cc:1275] T 29e24069ebd04300a2c4cf4d4bdc5e66 P f5202ea2c8244e849a11073ee5d918c5 [term 2 FOLLOWER]: Refusing update from remote peer a4ec6dffeb11435b8655672771cd29c4: Log matching property violated. Preceding OpId in replica: term: 1 index: 70. Preceding OpId from leader: term: 2 index: 72. (index mismatch)
W20260430 02:01:56.279801 25568 consensus_peers.cc:597] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
I20260430 02:01:56.279814 25801 consensus_queue.cc:1048] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 [LEADER]: Connected to new peer: Peer: permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 71, Last known committed idx: 70, Time since last communication: 0.000s
W20260430 02:01:56.280778 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.296902 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.302846 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.320866 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.326468 25587 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.327649 25587 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.338227 25587 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.349660 25587 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.368065 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.382767 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.385841 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.408219 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.413991 25587 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.415619 25587 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.429169 25587 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.432497 25568 consensus_peers.cc:597] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20260430 02:01:56.432497 25437 consensus_peers.cc:597] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20260430 02:01:56.440686 25568 consensus_peers.cc:597] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20260430 02:01:56.440709 25587 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.458895 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.478062 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.478864 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.503329 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.512042 25587 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.512810 25587 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.527442 25587 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.535024 25587 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55472: Illegal state: replica a4ec6dffeb11435b8655672771cd29c4 is not leader of this config: current role FOLLOWER
I20260430 02:01:56.545444 25770 raft_consensus.cc:493] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 [term 2 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260430 02:01:56.545614 25770 raft_consensus.cc:515] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 [term 2 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:56.546128 25770 leader_election.cc:290] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 [CANDIDATE]: Term 3 pre-election: Requested pre-vote from peers e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971), f5202ea2c8244e849a11073ee5d918c5 (127.24.153.66:43745)
I20260430 02:01:56.546829 25501 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "fbfe4b6b54594e36a76de6e54d2adb8c" candidate_uuid: "a4ec6dffeb11435b8655672771cd29c4" candidate_term: 3 candidate_status { last_received { term: 1 index: 68 } } ignore_live_leader: false dest_uuid: "f5202ea2c8244e849a11073ee5d918c5" is_pre_election: true
I20260430 02:01:56.546976 25501 raft_consensus.cc:2468] T fbfe4b6b54594e36a76de6e54d2adb8c P f5202ea2c8244e849a11073ee5d918c5 [term 2 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate a4ec6dffeb11435b8655672771cd29c4 in term 2.
I20260430 02:01:56.547262 25568 leader_election.cc:304] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 [CANDIDATE]: Term 3 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: a4ec6dffeb11435b8655672771cd29c4, f5202ea2c8244e849a11073ee5d918c5; no voters:
I20260430 02:01:56.547449 25770 raft_consensus.cc:2804] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 [term 2 FOLLOWER]: Leader pre-election won for term 3
I20260430 02:01:56.547513 25770 raft_consensus.cc:493] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 [term 2 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260430 02:01:56.547585 25770 raft_consensus.cc:3060] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 [term 2 FOLLOWER]: Advancing to term 3
W20260430 02:01:56.547869 25568 leader_election.cc:336] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 [CANDIDATE]: Term 3 pre-election: RPC error from VoteRequest() call to peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111)
I20260430 02:01:56.548770 25770 raft_consensus.cc:515] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 [term 3 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:56.549156 25770 leader_election.cc:290] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 [CANDIDATE]: Term 3 election: Requested vote from peers e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971), f5202ea2c8244e849a11073ee5d918c5 (127.24.153.66:43745)
I20260430 02:01:56.549664 25501 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "fbfe4b6b54594e36a76de6e54d2adb8c" candidate_uuid: "a4ec6dffeb11435b8655672771cd29c4" candidate_term: 3 candidate_status { last_received { term: 1 index: 68 } } ignore_live_leader: false dest_uuid: "f5202ea2c8244e849a11073ee5d918c5"
I20260430 02:01:56.549816 25501 raft_consensus.cc:3060] T fbfe4b6b54594e36a76de6e54d2adb8c P f5202ea2c8244e849a11073ee5d918c5 [term 2 FOLLOWER]: Advancing to term 3
W20260430 02:01:56.550302 25568 leader_election.cc:336] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 [CANDIDATE]: Term 3 election: RPC error from VoteRequest() call to peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111)
I20260430 02:01:56.551039 25501 raft_consensus.cc:2468] T fbfe4b6b54594e36a76de6e54d2adb8c P f5202ea2c8244e849a11073ee5d918c5 [term 3 FOLLOWER]: Leader election vote request: Granting yes vote for candidate a4ec6dffeb11435b8655672771cd29c4 in term 3.
I20260430 02:01:56.551453 25568 leader_election.cc:304] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 [CANDIDATE]: Term 3 election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: a4ec6dffeb11435b8655672771cd29c4, f5202ea2c8244e849a11073ee5d918c5; no voters: e481cdb11e4040f2b78a224a2bed5eaa
I20260430 02:01:56.551636 25801 raft_consensus.cc:2804] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 [term 3 FOLLOWER]: Leader election won for term 3
I20260430 02:01:56.551798 25801 raft_consensus.cc:697] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 [term 3 LEADER]: Becoming Leader. State: Replica: a4ec6dffeb11435b8655672771cd29c4, State: Running, Role: LEADER
I20260430 02:01:56.551946 25801 consensus_queue.cc:237] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 68, Committed index: 68, Last appended: 1.68, Last appended by leader: 68, Current term: 3, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:01:56.553573 25224 catalog_manager.cc:5671] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 reported cstate change: term changed from 1 to 3, leader changed from e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65) to a4ec6dffeb11435b8655672771cd29c4 (127.24.153.67). New cstate: current_term: 3 leader_uuid: "a4ec6dffeb11435b8655672771cd29c4" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } health_report { overall_health: HEALTHY } } }
W20260430 02:01:56.559813 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
I20260430 02:01:56.574141 25501 raft_consensus.cc:1275] T fbfe4b6b54594e36a76de6e54d2adb8c P f5202ea2c8244e849a11073ee5d918c5 [term 3 FOLLOWER]: Refusing update from remote peer a4ec6dffeb11435b8655672771cd29c4: Log matching property violated. Preceding OpId in replica: term: 1 index: 68. Preceding OpId from leader: term: 3 index: 70. (index mismatch)
W20260430 02:01:56.574831 25568 consensus_peers.cc:597] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
I20260430 02:01:56.574862 25801 consensus_queue.cc:1048] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 [LEADER]: Connected to new peer: Peer: permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 69, Last known committed idx: 68, Time since last communication: 0.000s
I20260430 02:01:56.621150 25926 heartbeater.cc:499] Master 127.24.153.126:40723 was elected leader, sending a full tablet report...
W20260430 02:01:56.737325 25437 consensus_peers.cc:597] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20260430 02:01:56.756116 25461 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.764122 25460 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.768467 25461 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58914: Illegal state: replica f5202ea2c8244e849a11073ee5d918c5 is not leader of this config: current role FOLLOWER
W20260430 02:01:56.837765 25568 consensus_peers.cc:597] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20260430 02:01:56.936089 25568 consensus_peers.cc:597] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20260430 02:01:56.959405 25568 consensus_peers.cc:597] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20260430 02:01:56.995906 25437 consensus_peers.cc:597] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20260430 02:01:57.077149 25568 consensus_peers.cc:597] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20260430 02:01:57.253634 25437 consensus_peers.cc:597] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20260430 02:01:57.360838 25568 consensus_peers.cc:597] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20260430 02:01:57.423069 25568 consensus_peers.cc:597] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20260430 02:01:57.503868 25568 consensus_peers.cc:597] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20260430 02:01:57.534770 25437 consensus_peers.cc:597] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20260430 02:01:57.601516 25568 consensus_peers.cc:597] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20260430 02:01:57.770716 25437 consensus_peers.cc:597] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20260430 02:01:57.912324 25568 consensus_peers.cc:597] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20260430 02:01:57.914526 25568 consensus_peers.cc:597] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
W20260430 02:01:57.932986 25568 consensus_peers.cc:597] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20260430 02:01:58.003253 25795 consensus_queue.cc:579] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 [LEADER]: Leader has been unable to successfully communicate with peer e481cdb11e4040f2b78a224a2bed5eaa for more than 2 seconds (2.061s)
W20260430 02:01:58.007725 25437 consensus_peers.cc:597] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20260430 02:01:58.008373 25956 consensus_queue.cc:579] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 [LEADER]: Leader has been unable to successfully communicate with peer e481cdb11e4040f2b78a224a2bed5eaa for more than 2 seconds (2.067s)
I20260430 02:01:58.028355 25951 consensus_queue.cc:579] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 [LEADER]: Leader has been unable to successfully communicate with peer e481cdb11e4040f2b78a224a2bed5eaa for more than 2 seconds (2.076s)
W20260430 02:01:58.184927 25568 consensus_peers.cc:597] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
I20260430 02:01:58.300905 25796 consensus_queue.cc:579] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 [LEADER]: Leader has been unable to successfully communicate with peer e481cdb11e4040f2b78a224a2bed5eaa for more than 2 seconds (2.060s)
W20260430 02:01:58.304922 25437 consensus_peers.cc:597] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20260430 02:01:58.315485 25959 consensus_queue.cc:579] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 [LEADER]: Leader has been unable to successfully communicate with peer e481cdb11e4040f2b78a224a2bed5eaa for more than 2 seconds (2.055s)
W20260430 02:01:58.415305 25568 consensus_peers.cc:597] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20260430 02:01:58.463671 25568 consensus_peers.cc:597] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
W20260430 02:01:58.467545 25568 consensus_peers.cc:597] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20260430 02:01:58.492914 25437 consensus_peers.cc:597] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
I20260430 02:01:58.656985 25801 consensus_queue.cc:579] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 [LEADER]: Leader has been unable to successfully communicate with peer e481cdb11e4040f2b78a224a2bed5eaa for more than 2 seconds (2.105s)
W20260430 02:01:58.668310 25568 consensus_peers.cc:597] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
W20260430 02:01:58.807617 25437 consensus_peers.cc:597] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20260430 02:01:58.933982 25568 consensus_peers.cc:597] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
W20260430 02:01:58.969220 25568 consensus_peers.cc:597] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20260430 02:01:59.033104 25568 consensus_peers.cc:597] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
W20260430 02:01:59.049968 25437 consensus_peers.cc:597] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
I20260430 02:01:59.073310 25189 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu with pid 25197
I20260430 02:01:59.104979 25189 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
/tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.24.153.126:40723
--webserver_interface=127.24.153.126
--webserver_port=38027
--builtin_ntp_servers=127.24.153.84:39365
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.24.153.126:40723 with env {}
W20260430 02:01:59.161301 25568 consensus_peers.cc:597] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20260430 02:01:59.282446 25437 consensus_peers.cc:597] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
W20260430 02:01:59.307698 25974 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260430 02:01:59.308127 25974 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260430 02:01:59.308240 25974 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260430 02:01:59.311344 25547 heartbeater.cc:646] Failed to heartbeat to 127.24.153.126:40723 (0 consecutive failures): Network error: Failed to send heartbeat to master: Client connection negotiation failed: client connection to 127.24.153.126:40723: connect: Connection refused (error 111)
W20260430 02:01:59.313376 25974 flags.cc:432] Enabled experimental flag: --ipki_ca_key_size=768
W20260430 02:01:59.313485 25974 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260430 02:01:59.313524 25974 flags.cc:432] Enabled experimental flag: --tsk_num_rsa_bits=512
W20260430 02:01:59.313614 25974 flags.cc:432] Enabled experimental flag: --rpc_reuseport=true
I20260430 02:01:59.320060 25974 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.24.153.84:39365
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.24.153.126:40723
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.24.153.126:40723
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.24.153.126
--webserver_port=38027
--never_fsync=true
--heap_profile_path=/tmp/kudu.25974
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true
Master server version:
kudu 1.19.0-SNAPSHOT
revision 8537ed5bb5e0e6ac1e481ff961e6b9d08cce7671
build type DEBUG
built by None at 30 Apr 2026 01:43:13 UTC on bdcb31816ec0
build id 11668
I20260430 02:01:59.321904 25974 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260430 02:01:59.323650 25974 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260430 02:01:59.331532 25982 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:01:59.332140 25980 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260430 02:01:59.332633 25974 server_base.cc:1061] running on GCE node
W20260430 02:01:59.332957 25979 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260430 02:01:59.333673 25974 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260430 02:01:59.334932 25974 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260430 02:01:59.345928 25974 hybrid_clock.cc:648] HybridClock initialized: now 1777514519345831 us; error 85 us; skew 500 ppm
I20260430 02:01:59.348876 25974 webserver.cc:492] Webserver started at http://127.24.153.126:38027/ using document root <none> and password file <none>
I20260430 02:01:59.349785 25974 fs_manager.cc:362] Metadata directory not provided
I20260430 02:01:59.350000 25974 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260430 02:01:59.364226 25974 fs_manager.cc:714] Time spent opening directory manager: real 0.012s user 0.002s sys 0.002s
I20260430 02:01:59.371738 25988 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260430 02:01:59.373381 25974 fs_manager.cc:730] Time spent opening block manager: real 0.008s user 0.001s sys 0.002s
I20260430 02:01:59.373538 25974 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/master-0/data,/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/master-0/wal
uuid: "10e131cec14c4db9a800c3f181cedd66"
format_stamp: "Formatted at 2026-04-30 02:01:53 on dist-test-slave-f7mg"
I20260430 02:01:59.374148 25974 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260430 02:01:59.410554 25974 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260430 02:01:59.411536 25974 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260430 02:01:59.411991 25974 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260430 02:01:59.460748 26041 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.24.153.126:40723 every 8 connection(s)
I20260430 02:01:59.460748 25974 rpc_server.cc:307] RPC server started. Bound to: 127.24.153.126:40723
I20260430 02:01:59.462735 25974 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/master-0/data/info.pb
I20260430 02:01:59.470764 25189 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu as pid 25974
W20260430 02:01:59.478439 25568 consensus_peers.cc:597] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 36: this message will repeat every 5th retry.
I20260430 02:01:59.485109 26042 sys_catalog.cc:263] Verifying existing consensus state
I20260430 02:01:59.499548 26042 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66: Bootstrap starting.
I20260430 02:01:59.539475 26042 log.cc:826] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66: Log is configured to *not* fsync() on all Append() calls
I20260430 02:01:59.559732 26042 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66: Bootstrap replayed 1/1 log segments. Stats: ops{read=16 overwritten=0 applied=16 ignored=0} inserts{seen=11 ignored=0} mutations{seen=15 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20260430 02:01:59.561141 26042 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66: Bootstrap complete.
I20260430 02:01:59.569314 26042 raft_consensus.cc:359] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66 [term 1 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "10e131cec14c4db9a800c3f181cedd66" member_type: VOTER last_known_addr { host: "127.24.153.126" port: 40723 } }
W20260430 02:01:59.569983 25437 consensus_peers.cc:597] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 36: this message will repeat every 5th retry.
I20260430 02:01:59.570017 26042 raft_consensus.cc:740] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66 [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: 10e131cec14c4db9a800c3f181cedd66, State: Initialized, Role: FOLLOWER
I20260430 02:01:59.570744 26042 consensus_queue.cc:260] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 16, Last appended: 1.16, Last appended by leader: 16, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "10e131cec14c4db9a800c3f181cedd66" member_type: VOTER last_known_addr { host: "127.24.153.126" port: 40723 } }
I20260430 02:01:59.571007 26042 raft_consensus.cc:399] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66 [term 1 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20260430 02:01:59.571127 26042 raft_consensus.cc:493] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66 [term 1 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20260430 02:01:59.571303 26042 raft_consensus.cc:3060] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66 [term 1 FOLLOWER]: Advancing to term 2
I20260430 02:01:59.581802 26042 raft_consensus.cc:515] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66 [term 2 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "10e131cec14c4db9a800c3f181cedd66" member_type: VOTER last_known_addr { host: "127.24.153.126" port: 40723 } }
I20260430 02:01:59.582353 26042 leader_election.cc:304] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66 [CANDIDATE]: Term 2 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 10e131cec14c4db9a800c3f181cedd66; no voters:
I20260430 02:01:59.582902 26042 leader_election.cc:290] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66 [CANDIDATE]: Term 2 election: Requested vote from peers
I20260430 02:01:59.583317 26045 raft_consensus.cc:2804] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66 [term 2 FOLLOWER]: Leader election won for term 2
I20260430 02:01:59.583619 26045 raft_consensus.cc:697] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66 [term 2 LEADER]: Becoming Leader. State: Replica: 10e131cec14c4db9a800c3f181cedd66, State: Running, Role: LEADER
I20260430 02:01:59.583990 26045 consensus_queue.cc:237] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 16, Committed index: 16, Last appended: 1.16, Last appended by leader: 16, Current term: 2, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "10e131cec14c4db9a800c3f181cedd66" member_type: VOTER last_known_addr { host: "127.24.153.126" port: 40723 } }
I20260430 02:01:59.584266 26042 sys_catalog.cc:565] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66 [sys.catalog]: configured and running, proceeding with master startup.
I20260430 02:01:59.585049 26045 sys_catalog.cc:455] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 2 leader_uuid: "10e131cec14c4db9a800c3f181cedd66" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "10e131cec14c4db9a800c3f181cedd66" member_type: VOTER last_known_addr { host: "127.24.153.126" port: 40723 } } }
I20260430 02:01:59.585263 26045 sys_catalog.cc:458] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66 [sys.catalog]: This master's current role is: LEADER
I20260430 02:01:59.585534 26045 sys_catalog.cc:455] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66 [sys.catalog]: SysCatalogTable state changed. Reason: New leader 10e131cec14c4db9a800c3f181cedd66. Latest consensus state: current_term: 2 leader_uuid: "10e131cec14c4db9a800c3f181cedd66" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "10e131cec14c4db9a800c3f181cedd66" member_type: VOTER last_known_addr { host: "127.24.153.126" port: 40723 } } }
I20260430 02:01:59.585702 26045 sys_catalog.cc:458] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66 [sys.catalog]: This master's current role is: LEADER
W20260430 02:01:59.585902 25568 consensus_peers.cc:597] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 36: this message will repeat every 5th retry.
I20260430 02:01:59.586313 26053 catalog_manager.cc:1485] Loading table and tablet metadata into memory...
W20260430 02:01:59.588056 25568 consensus_peers.cc:597] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
I20260430 02:01:59.592335 26053 catalog_manager.cc:679] Loaded metadata for table test-workload [id=86672283a7964500ab7eab1ca56ae85d]
I20260430 02:01:59.593761 26053 tablet_loader.cc:96] loaded metadata for tablet 29e24069ebd04300a2c4cf4d4bdc5e66 (table test-workload [id=86672283a7964500ab7eab1ca56ae85d])
I20260430 02:01:59.594180 26053 tablet_loader.cc:96] loaded metadata for tablet 2c080a5b649f453c903e1dce5ab6a113 (table test-workload [id=86672283a7964500ab7eab1ca56ae85d])
I20260430 02:01:59.594509 26053 tablet_loader.cc:96] loaded metadata for tablet 63c4457448ec4b1b8a0741a7560cddfe (table test-workload [id=86672283a7964500ab7eab1ca56ae85d])
I20260430 02:01:59.594830 26053 tablet_loader.cc:96] loaded metadata for tablet 869a29991dbe4537bf1082d1c9ee2ecd (table test-workload [id=86672283a7964500ab7eab1ca56ae85d])
I20260430 02:01:59.595088 26053 tablet_loader.cc:96] loaded metadata for tablet f1dbbbee06674a93a3dd31c45c90d59d (table test-workload [id=86672283a7964500ab7eab1ca56ae85d])
I20260430 02:01:59.595346 26053 tablet_loader.cc:96] loaded metadata for tablet fbfe4b6b54594e36a76de6e54d2adb8c (table test-workload [id=86672283a7964500ab7eab1ca56ae85d])
I20260430 02:01:59.595518 26053 catalog_manager.cc:1494] Initializing Kudu cluster ID...
I20260430 02:01:59.596313 26053 catalog_manager.cc:1269] Loaded cluster ID: 7aa1ab3da6514d7f8f19e5fdffeb4a89
I20260430 02:01:59.596416 26053 catalog_manager.cc:1505] Initializing Kudu internal certificate authority...
I20260430 02:01:59.610342 26053 catalog_manager.cc:1514] Loading token signing keys...
I20260430 02:01:59.611801 26053 catalog_manager.cc:6055] T 00000000000000000000000000000000 P 10e131cec14c4db9a800c3f181cedd66: Loaded TSK: 0
I20260430 02:01:59.613232 26053 catalog_manager.cc:1524] Initializing in-progress tserver states...
I20260430 02:01:59.643875 26003 master_service.cc:438] Got heartbeat from unknown tserver (permanent_uuid: "7f82aeb1ab544f90b83142774323b0a3" instance_seqno: 1777514515576017) as {username='slave'} at 127.24.153.68:38629; Asking this server to re-register.
I20260430 02:01:59.645201 25926 heartbeater.cc:461] Registering TS with master...
I20260430 02:01:59.645398 25926 heartbeater.cc:507] Master 127.24.153.126:40723 requested a full tablet report, sending...
I20260430 02:01:59.646561 26003 ts_manager.cc:194] Registered new tserver with Master: 7f82aeb1ab544f90b83142774323b0a3 (127.24.153.68:45771)
W20260430 02:01:59.669869 25568 consensus_peers.cc:597] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
I20260430 02:01:59.680042 26003 master_service.cc:438] Got heartbeat from unknown tserver (permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" instance_seqno: 1777514514698596) as {username='slave'} at 127.24.153.67:52579; Asking this server to re-register.
I20260430 02:01:59.681094 25678 heartbeater.cc:461] Registering TS with master...
I20260430 02:01:59.681313 25678 heartbeater.cc:507] Master 127.24.153.126:40723 requested a full tablet report, sending...
I20260430 02:01:59.685297 26003 ts_manager.cc:194] Registered new tserver with Master: a4ec6dffeb11435b8655672771cd29c4 (127.24.153.67:34223)
W20260430 02:01:59.765087 25437 consensus_peers.cc:597] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 36: this message will repeat every 5th retry.
W20260430 02:01:59.976508 25568 consensus_peers.cc:597] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 41: this message will repeat every 5th retry.
W20260430 02:02:00.036860 25568 consensus_peers.cc:597] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 36: this message will repeat every 5th retry.
W20260430 02:02:00.075414 25568 consensus_peers.cc:597] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 41: this message will repeat every 5th retry.
W20260430 02:02:00.133170 25437 consensus_peers.cc:597] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 41: this message will repeat every 5th retry.
W20260430 02:02:00.133808 25568 consensus_peers.cc:597] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 36: this message will repeat every 5th retry.
W20260430 02:02:00.254598 25437 consensus_peers.cc:597] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 41: this message will repeat every 5th retry.
I20260430 02:02:00.355168 25547 heartbeater.cc:344] Connected to a master server at 127.24.153.126:40723
I20260430 02:02:00.362762 26003 master_service.cc:438] Got heartbeat from unknown tserver (permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" instance_seqno: 1777514514479987) as {username='slave'} at 127.24.153.66:39545; Asking this server to re-register.
I20260430 02:02:00.364109 25547 heartbeater.cc:461] Registering TS with master...
I20260430 02:02:00.364307 25547 heartbeater.cc:507] Master 127.24.153.126:40723 requested a full tablet report, sending...
I20260430 02:02:00.365679 26003 ts_manager.cc:194] Registered new tserver with Master: f5202ea2c8244e849a11073ee5d918c5 (127.24.153.66:43745)
W20260430 02:02:00.512586 25568 consensus_peers.cc:597] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 46: this message will repeat every 5th retry.
W20260430 02:02:00.584015 25568 consensus_peers.cc:597] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 41: this message will repeat every 5th retry.
W20260430 02:02:00.636785 25568 consensus_peers.cc:597] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 46: this message will repeat every 5th retry.
W20260430 02:02:00.687310 25568 consensus_peers.cc:597] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 41: this message will repeat every 5th retry.
W20260430 02:02:00.714841 25437 consensus_peers.cc:597] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 46: this message will repeat every 5th retry.
W20260430 02:02:00.744899 25437 consensus_peers.cc:597] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 46: this message will repeat every 5th retry.
W20260430 02:02:01.008769 25437 proxy.cc:239] Call had error, refreshing address and retrying: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111) [suppressed 100 similar messages]
W20260430 02:02:01.038475 25568 proxy.cc:239] Call had error, refreshing address and retrying: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111) [suppressed 193 similar messages]
W20260430 02:02:01.039222 25568 consensus_peers.cc:597] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 51: this message will repeat every 5th retry.
W20260430 02:02:01.089154 25568 consensus_peers.cc:597] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 46: this message will repeat every 5th retry.
W20260430 02:02:01.158493 25568 consensus_peers.cc:597] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 51: this message will repeat every 5th retry.
W20260430 02:02:01.213851 25437 consensus_peers.cc:597] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 51: this message will repeat every 5th retry.
W20260430 02:02:01.253475 25437 consensus_peers.cc:597] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 51: this message will repeat every 5th retry.
W20260430 02:02:01.277034 25568 consensus_peers.cc:597] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 46: this message will repeat every 5th retry.
W20260430 02:02:01.535261 25568 consensus_peers.cc:597] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 56: this message will repeat every 5th retry.
W20260430 02:02:01.672693 25568 consensus_peers.cc:597] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 56: this message will repeat every 5th retry.
W20260430 02:02:01.695111 25568 consensus_peers.cc:597] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 51: this message will repeat every 5th retry.
W20260430 02:02:01.744917 25437 consensus_peers.cc:597] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 56: this message will repeat every 5th retry.
W20260430 02:02:01.758621 25568 consensus_peers.cc:597] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 51: this message will repeat every 5th retry.
W20260430 02:02:01.787307 25437 consensus_peers.cc:597] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 56: this message will repeat every 5th retry.
W20260430 02:02:02.059670 25568 consensus_peers.cc:597] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 61: this message will repeat every 5th retry.
W20260430 02:02:02.192714 25568 consensus_peers.cc:597] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 61: this message will repeat every 5th retry.
W20260430 02:02:02.281747 25568 consensus_peers.cc:597] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 56: this message will repeat every 5th retry.
W20260430 02:02:02.293068 25568 consensus_peers.cc:597] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 56: this message will repeat every 5th retry.
W20260430 02:02:02.305857 25437 consensus_peers.cc:597] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 61: this message will repeat every 5th retry.
W20260430 02:02:02.316363 25437 consensus_peers.cc:597] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 61: this message will repeat every 5th retry.
I20260430 02:02:02.509729 25189 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
/tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.24.153.65:44971
--local_ip_for_outbound_sockets=127.24.153.65
--tserver_master_addrs=127.24.153.126:40723
--webserver_port=38833
--webserver_interface=127.24.153.65
--builtin_ntp_servers=127.24.153.84:39365
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--follower_unavailable_considered_failed_sec=2
--enable_log_gc=false with env {}
W20260430 02:02:02.632073 25568 consensus_peers.cc:597] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 66: this message will repeat every 5th retry.
W20260430 02:02:02.670827 25568 consensus_peers.cc:597] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 66: this message will repeat every 5th retry.
W20260430 02:02:02.769263 26074 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260430 02:02:02.769699 26074 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260430 02:02:02.769819 26074 flags.cc:432] Enabled unsafe flag: --enable_log_gc=false
W20260430 02:02:02.769960 26074 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260430 02:02:02.776428 26074 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260430 02:02:02.776727 26074 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.24.153.65
I20260430 02:02:02.785938 26074 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.24.153.84:39365
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--follower_unavailable_considered_failed_sec=2
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.24.153.65:44971
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.24.153.65
--webserver_port=38833
--enable_log_gc=false
--tserver_master_addrs=127.24.153.126:40723
--never_fsync=true
--heap_profile_path=/tmp/kudu.26074
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.24.153.65
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8537ed5bb5e0e6ac1e481ff961e6b9d08cce7671
build type DEBUG
built by None at 30 Apr 2026 01:43:13 UTC on bdcb31816ec0
build id 11668
I20260430 02:02:02.788131 26074 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260430 02:02:02.789827 26074 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260430 02:02:02.793777 26074 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260430 02:02:02.801611 26088 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:02:02.802573 26086 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:02:02.804289 26085 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260430 02:02:02.806061 26074 server_base.cc:1061] running on GCE node
I20260430 02:02:02.806569 26074 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260430 02:02:02.807289 26074 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260430 02:02:02.811124 26074 hybrid_clock.cc:648] HybridClock initialized: now 1777514522811050 us; error 82 us; skew 500 ppm
I20260430 02:02:02.813870 26074 webserver.cc:492] Webserver started at http://127.24.153.65:38833/ using document root <none> and password file <none>
I20260430 02:02:02.814790 26074 fs_manager.cc:362] Metadata directory not provided
I20260430 02:02:02.814896 26074 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260430 02:02:02.824013 26074 fs_manager.cc:714] Time spent opening directory manager: real 0.007s user 0.004s sys 0.003s
W20260430 02:02:02.874373 25568 consensus_peers.cc:597] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 61: this message will repeat every 5th retry.
W20260430 02:02:02.875591 25437 consensus_peers.cc:597] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 66: this message will repeat every 5th retry.
I20260430 02:02:02.876031 26094 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260430 02:02:02.878334 26074 fs_manager.cc:730] Time spent opening block manager: real 0.031s user 0.001s sys 0.002s
I20260430 02:02:02.878520 26074 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-0/data,/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-0/wal
uuid: "e481cdb11e4040f2b78a224a2bed5eaa"
format_stamp: "Formatted at 2026-04-30 02:01:54 on dist-test-slave-f7mg"
I20260430 02:02:02.879320 26074 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
W20260430 02:02:02.881589 25568 consensus_peers.cc:597] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 61: this message will repeat every 5th retry.
W20260430 02:02:02.906641 25437 consensus_peers.cc:597] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 66: this message will repeat every 5th retry.
I20260430 02:02:02.919575 26074 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260430 02:02:02.920464 26074 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260430 02:02:02.920697 26074 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260430 02:02:02.921413 26074 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260430 02:02:02.935715 26103 ts_tablet_manager.cc:542] Loading tablet metadata (0/6 complete)
I20260430 02:02:02.949565 26074 ts_tablet_manager.cc:585] Loaded tablet metadata (6 total tablets, 6 live tablets)
I20260430 02:02:02.949659 26074 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.022s user 0.000s sys 0.000s
I20260430 02:02:02.949739 26074 ts_tablet_manager.cc:600] Registering tablets (0/6 complete)
I20260430 02:02:02.957770 26074 ts_tablet_manager.cc:616] Registered 6 tablets
I20260430 02:02:02.957962 26074 ts_tablet_manager.cc:595] Time spent register tablets: real 0.008s user 0.006s sys 0.000s
I20260430 02:02:02.962638 26103 tablet_bootstrap.cc:492] T 29e24069ebd04300a2c4cf4d4bdc5e66 P e481cdb11e4040f2b78a224a2bed5eaa: Bootstrap starting.
I20260430 02:02:02.996129 26074 rpc_server.cc:307] RPC server started. Bound to: 127.24.153.65:44971
I20260430 02:02:03.002380 26074 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777514513829142-25189-0/minicluster-data/ts-0/data/info.pb
I20260430 02:02:03.011973 25189 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu as pid 26074
I20260430 02:02:03.004858 26210 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.24.153.65:44971 every 8 connection(s)
I20260430 02:02:03.015432 26103 log.cc:826] T 29e24069ebd04300a2c4cf4d4bdc5e66 P e481cdb11e4040f2b78a224a2bed5eaa: Log is configured to *not* fsync() on all Append() calls
I20260430 02:02:03.099992 26211 heartbeater.cc:344] Connected to a master server at 127.24.153.126:40723
I20260430 02:02:03.100335 26211 heartbeater.cc:461] Registering TS with master...
I20260430 02:02:03.101805 26211 heartbeater.cc:507] Master 127.24.153.126:40723 requested a full tablet report, sending...
I20260430 02:02:03.105377 26003 ts_manager.cc:194] Registered new tserver with Master: e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971)
I20260430 02:02:03.107856 26003 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.24.153.65:48055
W20260430 02:02:03.108166 25568 consensus_peers.cc:597] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: INITIALIZED. This is attempt 71: this message will repeat every 5th retry.
W20260430 02:02:03.240139 25568 consensus_peers.cc:597] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: INITIALIZED. This is attempt 71: this message will repeat every 5th retry.
I20260430 02:02:03.291527 26103 tablet_bootstrap.cc:492] T 29e24069ebd04300a2c4cf4d4bdc5e66 P e481cdb11e4040f2b78a224a2bed5eaa: Bootstrap replayed 1/1 log segments. Stats: ops{read=70 overwritten=0 applied=70 ignored=0} inserts{seen=594 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20260430 02:02:03.301741 26103 tablet_bootstrap.cc:492] T 29e24069ebd04300a2c4cf4d4bdc5e66 P e481cdb11e4040f2b78a224a2bed5eaa: Bootstrap complete.
I20260430 02:02:03.303228 26103 ts_tablet_manager.cc:1403] T 29e24069ebd04300a2c4cf4d4bdc5e66 P e481cdb11e4040f2b78a224a2bed5eaa: Time spent bootstrapping tablet: real 0.341s user 0.090s sys 0.021s
I20260430 02:02:03.326495 26103 raft_consensus.cc:359] T 29e24069ebd04300a2c4cf4d4bdc5e66 P e481cdb11e4040f2b78a224a2bed5eaa [term 1 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } }
I20260430 02:02:03.327037 26103 raft_consensus.cc:740] T 29e24069ebd04300a2c4cf4d4bdc5e66 P e481cdb11e4040f2b78a224a2bed5eaa [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: e481cdb11e4040f2b78a224a2bed5eaa, State: Initialized, Role: FOLLOWER
I20260430 02:02:03.327630 26103 consensus_queue.cc:260] T 29e24069ebd04300a2c4cf4d4bdc5e66 P e481cdb11e4040f2b78a224a2bed5eaa [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 70, Last appended: 1.70, Last appended by leader: 70, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } }
I20260430 02:02:03.328660 26103 ts_tablet_manager.cc:1434] T 29e24069ebd04300a2c4cf4d4bdc5e66 P e481cdb11e4040f2b78a224a2bed5eaa: Time spent starting tablet: real 0.025s user 0.001s sys 0.004s
I20260430 02:02:03.329061 26211 heartbeater.cc:499] Master 127.24.153.126:40723 was elected leader, sending a full tablet report...
I20260430 02:02:03.329103 26103 tablet_bootstrap.cc:492] T 869a29991dbe4537bf1082d1c9ee2ecd P e481cdb11e4040f2b78a224a2bed5eaa: Bootstrap starting.
I20260430 02:02:03.367918 26164 raft_consensus.cc:3060] T 29e24069ebd04300a2c4cf4d4bdc5e66 P e481cdb11e4040f2b78a224a2bed5eaa [term 1 FOLLOWER]: Advancing to term 2
W20260430 02:02:03.408871 25437 consensus_peers.cc:597] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: INITIALIZED. This is attempt 71: this message will repeat every 5th retry.
W20260430 02:02:03.435552 25437 consensus_peers.cc:597] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: INITIALIZED. This is attempt 71: this message will repeat every 5th retry.
W20260430 02:02:03.465860 25568 consensus_peers.cc:597] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: INITIALIZED. This is attempt 66: this message will repeat every 5th retry.
W20260430 02:02:03.530593 26212 tablet_replica_mm_ops.cc:318] Log GC is disabled (check --enable_log_gc)
I20260430 02:02:03.671725 26219 raft_consensus.cc:493] T 29e24069ebd04300a2c4cf4d4bdc5e66 P e481cdb11e4040f2b78a224a2bed5eaa [term 2 FOLLOWER]: Starting pre-election (detected failure of leader a4ec6dffeb11435b8655672771cd29c4)
I20260430 02:02:03.671988 26219 raft_consensus.cc:515] T 29e24069ebd04300a2c4cf4d4bdc5e66 P e481cdb11e4040f2b78a224a2bed5eaa [term 2 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } }
I20260430 02:02:03.673213 26219 leader_election.cc:290] T 29e24069ebd04300a2c4cf4d4bdc5e66 P e481cdb11e4040f2b78a224a2bed5eaa [CANDIDATE]: Term 3 pre-election: Requested pre-vote from peers a4ec6dffeb11435b8655672771cd29c4 (127.24.153.67:34223), f5202ea2c8244e849a11073ee5d918c5 (127.24.153.66:43745)
I20260430 02:02:03.701256 25632 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "29e24069ebd04300a2c4cf4d4bdc5e66" candidate_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" candidate_term: 3 candidate_status { last_received { term: 2 index: 677 } } ignore_live_leader: false dest_uuid: "a4ec6dffeb11435b8655672771cd29c4" is_pre_election: true
W20260430 02:02:03.716130 25568 consensus_peers.cc:597] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: INITIALIZED. This is attempt 76: this message will repeat every 5th retry.
I20260430 02:02:03.716930 25501 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "29e24069ebd04300a2c4cf4d4bdc5e66" candidate_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" candidate_term: 3 candidate_status { last_received { term: 2 index: 677 } } ignore_live_leader: false dest_uuid: "f5202ea2c8244e849a11073ee5d918c5" is_pre_election: true
I20260430 02:02:03.717893 26100 leader_election.cc:304] T 29e24069ebd04300a2c4cf4d4bdc5e66 P e481cdb11e4040f2b78a224a2bed5eaa [CANDIDATE]: Term 3 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: e481cdb11e4040f2b78a224a2bed5eaa; no voters: a4ec6dffeb11435b8655672771cd29c4, f5202ea2c8244e849a11073ee5d918c5
I20260430 02:02:03.718153 26219 raft_consensus.cc:2749] T 29e24069ebd04300a2c4cf4d4bdc5e66 P e481cdb11e4040f2b78a224a2bed5eaa [term 2 FOLLOWER]: Leader pre-election lost for term 3. Reason: could not achieve majority
I20260430 02:02:03.755821 26103 tablet_bootstrap.cc:492] T 869a29991dbe4537bf1082d1c9ee2ecd P e481cdb11e4040f2b78a224a2bed5eaa: Bootstrap replayed 1/1 log segments. Stats: ops{read=70 overwritten=0 applied=69 ignored=0} inserts{seen=574 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20260430 02:02:03.757534 26103 tablet_bootstrap.cc:492] T 869a29991dbe4537bf1082d1c9ee2ecd P e481cdb11e4040f2b78a224a2bed5eaa: Bootstrap complete.
I20260430 02:02:03.760934 26103 ts_tablet_manager.cc:1403] T 869a29991dbe4537bf1082d1c9ee2ecd P e481cdb11e4040f2b78a224a2bed5eaa: Time spent bootstrapping tablet: real 0.432s user 0.072s sys 0.012s
I20260430 02:02:03.771111 26103 raft_consensus.cc:359] T 869a29991dbe4537bf1082d1c9ee2ecd P e481cdb11e4040f2b78a224a2bed5eaa [term 1 FOLLOWER]: Replica starting. Triggering 1 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:02:03.772119 26103 raft_consensus.cc:740] T 869a29991dbe4537bf1082d1c9ee2ecd P e481cdb11e4040f2b78a224a2bed5eaa [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: e481cdb11e4040f2b78a224a2bed5eaa, State: Initialized, Role: FOLLOWER
I20260430 02:02:03.772518 26103 consensus_queue.cc:260] T 869a29991dbe4537bf1082d1c9ee2ecd P e481cdb11e4040f2b78a224a2bed5eaa [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 69, Last appended: 1.70, Last appended by leader: 70, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:02:03.776257 26103 ts_tablet_manager.cc:1434] T 869a29991dbe4537bf1082d1c9ee2ecd P e481cdb11e4040f2b78a224a2bed5eaa: Time spent starting tablet: real 0.006s user 0.002s sys 0.000s
I20260430 02:02:03.780261 26103 tablet_bootstrap.cc:492] T 63c4457448ec4b1b8a0741a7560cddfe P e481cdb11e4040f2b78a224a2bed5eaa: Bootstrap starting.
I20260430 02:02:03.882530 26165 raft_consensus.cc:1217] T 869a29991dbe4537bf1082d1c9ee2ecd P e481cdb11e4040f2b78a224a2bed5eaa [term 1 FOLLOWER]: Deduplicated request from leader. Original: 1.69->[1.70-1.703] Dedup: 1.70->[1.71-1.703]
W20260430 02:02:03.977120 25437 consensus_peers.cc:597] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: BOOTSTRAPPING. This is attempt 76: this message will repeat every 5th retry.
I20260430 02:02:04.006718 26103 tablet_bootstrap.cc:492] T 63c4457448ec4b1b8a0741a7560cddfe P e481cdb11e4040f2b78a224a2bed5eaa: Bootstrap replayed 1/1 log segments. Stats: ops{read=69 overwritten=0 applied=69 ignored=0} inserts{seen=566 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20260430 02:02:04.012198 26103 tablet_bootstrap.cc:492] T 63c4457448ec4b1b8a0741a7560cddfe P e481cdb11e4040f2b78a224a2bed5eaa: Bootstrap complete.
I20260430 02:02:04.013478 26103 ts_tablet_manager.cc:1403] T 63c4457448ec4b1b8a0741a7560cddfe P e481cdb11e4040f2b78a224a2bed5eaa: Time spent bootstrapping tablet: real 0.233s user 0.081s sys 0.007s
I20260430 02:02:04.014067 26103 raft_consensus.cc:359] T 63c4457448ec4b1b8a0741a7560cddfe P e481cdb11e4040f2b78a224a2bed5eaa [term 1 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:02:04.014245 26103 raft_consensus.cc:740] T 63c4457448ec4b1b8a0741a7560cddfe P e481cdb11e4040f2b78a224a2bed5eaa [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: e481cdb11e4040f2b78a224a2bed5eaa, State: Initialized, Role: FOLLOWER
I20260430 02:02:04.014437 26103 consensus_queue.cc:260] T 63c4457448ec4b1b8a0741a7560cddfe P e481cdb11e4040f2b78a224a2bed5eaa [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 69, Last appended: 1.69, Last appended by leader: 69, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:02:04.015167 26103 ts_tablet_manager.cc:1434] T 63c4457448ec4b1b8a0741a7560cddfe P e481cdb11e4040f2b78a224a2bed5eaa: Time spent starting tablet: real 0.002s user 0.001s sys 0.000s
I20260430 02:02:04.015407 26103 tablet_bootstrap.cc:492] T f1dbbbee06674a93a3dd31c45c90d59d P e481cdb11e4040f2b78a224a2bed5eaa: Bootstrap starting.
W20260430 02:02:04.155975 25437 consensus_peers.cc:597] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: BOOTSTRAPPING. This is attempt 76: this message will repeat every 5th retry.
W20260430 02:02:04.201992 25568 consensus_peers.cc:597] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: INITIALIZED. This is attempt 71: this message will repeat every 5th retry.
W20260430 02:02:04.202172 25568 consensus_peers.cc:597] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: INITIALIZED. This is attempt 81: this message will repeat every 5th retry.
I20260430 02:02:04.244822 26103 tablet_bootstrap.cc:492] T f1dbbbee06674a93a3dd31c45c90d59d P e481cdb11e4040f2b78a224a2bed5eaa: Bootstrap replayed 1/1 log segments. Stats: ops{read=68 overwritten=0 applied=68 ignored=0} inserts{seen=536 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20260430 02:02:04.245352 26103 tablet_bootstrap.cc:492] T f1dbbbee06674a93a3dd31c45c90d59d P e481cdb11e4040f2b78a224a2bed5eaa: Bootstrap complete.
I20260430 02:02:04.256345 26103 ts_tablet_manager.cc:1403] T f1dbbbee06674a93a3dd31c45c90d59d P e481cdb11e4040f2b78a224a2bed5eaa: Time spent bootstrapping tablet: real 0.241s user 0.078s sys 0.009s
I20260430 02:02:04.257332 26103 raft_consensus.cc:359] T f1dbbbee06674a93a3dd31c45c90d59d P e481cdb11e4040f2b78a224a2bed5eaa [term 1 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:02:04.257570 26103 raft_consensus.cc:740] T f1dbbbee06674a93a3dd31c45c90d59d P e481cdb11e4040f2b78a224a2bed5eaa [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: e481cdb11e4040f2b78a224a2bed5eaa, State: Initialized, Role: FOLLOWER
I20260430 02:02:04.258013 26103 consensus_queue.cc:260] T f1dbbbee06674a93a3dd31c45c90d59d P e481cdb11e4040f2b78a224a2bed5eaa [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 68, Last appended: 1.68, Last appended by leader: 68, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:02:04.262441 26103 ts_tablet_manager.cc:1434] T f1dbbbee06674a93a3dd31c45c90d59d P e481cdb11e4040f2b78a224a2bed5eaa: Time spent starting tablet: real 0.006s user 0.001s sys 0.000s
I20260430 02:02:04.263301 26103 tablet_bootstrap.cc:492] T 2c080a5b649f453c903e1dce5ab6a113 P e481cdb11e4040f2b78a224a2bed5eaa: Bootstrap starting.
I20260430 02:02:04.301224 26235 mvcc.cc:204] Tried to move back new op lower bound from 7280699487483367424 to 7280699458602889216. Current Snapshot: MvccSnapshot[applied={T|T < 7280699487483367424 or (T in {7280699487483367424})}]
I20260430 02:02:04.327236 26160 raft_consensus.cc:3060] T f1dbbbee06674a93a3dd31c45c90d59d P e481cdb11e4040f2b78a224a2bed5eaa [term 1 FOLLOWER]: Advancing to term 2
I20260430 02:02:04.458163 26103 tablet_bootstrap.cc:492] T 2c080a5b649f453c903e1dce5ab6a113 P e481cdb11e4040f2b78a224a2bed5eaa: Bootstrap replayed 1/1 log segments. Stats: ops{read=70 overwritten=0 applied=69 ignored=0} inserts{seen=543 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20260430 02:02:04.458827 26103 tablet_bootstrap.cc:492] T 2c080a5b649f453c903e1dce5ab6a113 P e481cdb11e4040f2b78a224a2bed5eaa: Bootstrap complete.
I20260430 02:02:04.460057 26103 ts_tablet_manager.cc:1403] T 2c080a5b649f453c903e1dce5ab6a113 P e481cdb11e4040f2b78a224a2bed5eaa: Time spent bootstrapping tablet: real 0.197s user 0.087s sys 0.008s
I20260430 02:02:04.460645 26103 raft_consensus.cc:359] T 2c080a5b649f453c903e1dce5ab6a113 P e481cdb11e4040f2b78a224a2bed5eaa [term 1 FOLLOWER]: Replica starting. Triggering 1 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:02:04.461115 26103 raft_consensus.cc:740] T 2c080a5b649f453c903e1dce5ab6a113 P e481cdb11e4040f2b78a224a2bed5eaa [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: e481cdb11e4040f2b78a224a2bed5eaa, State: Initialized, Role: FOLLOWER
I20260430 02:02:04.461769 26103 consensus_queue.cc:260] T 2c080a5b649f453c903e1dce5ab6a113 P e481cdb11e4040f2b78a224a2bed5eaa [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 69, Last appended: 1.70, Last appended by leader: 70, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:02:04.462237 26103 ts_tablet_manager.cc:1434] T 2c080a5b649f453c903e1dce5ab6a113 P e481cdb11e4040f2b78a224a2bed5eaa: Time spent starting tablet: real 0.002s user 0.001s sys 0.001s
I20260430 02:02:04.462546 26103 tablet_bootstrap.cc:492] T fbfe4b6b54594e36a76de6e54d2adb8c P e481cdb11e4040f2b78a224a2bed5eaa: Bootstrap starting.
I20260430 02:02:04.568228 26240 raft_consensus.cc:493] T 63c4457448ec4b1b8a0741a7560cddfe P e481cdb11e4040f2b78a224a2bed5eaa [term 1 FOLLOWER]: Starting pre-election (detected failure of leader f5202ea2c8244e849a11073ee5d918c5)
I20260430 02:02:04.568384 26240 raft_consensus.cc:515] T 63c4457448ec4b1b8a0741a7560cddfe P e481cdb11e4040f2b78a224a2bed5eaa [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:02:04.569253 26240 leader_election.cc:290] T 63c4457448ec4b1b8a0741a7560cddfe P e481cdb11e4040f2b78a224a2bed5eaa [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers f5202ea2c8244e849a11073ee5d918c5 (127.24.153.66:43745), a4ec6dffeb11435b8655672771cd29c4 (127.24.153.67:34223)
I20260430 02:02:04.573371 25631 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "63c4457448ec4b1b8a0741a7560cddfe" candidate_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" candidate_term: 2 candidate_status { last_received { term: 1 index: 712 } } ignore_live_leader: false dest_uuid: "a4ec6dffeb11435b8655672771cd29c4" is_pre_election: true
I20260430 02:02:04.583395 25500 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "63c4457448ec4b1b8a0741a7560cddfe" candidate_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" candidate_term: 2 candidate_status { last_received { term: 1 index: 712 } } ignore_live_leader: false dest_uuid: "f5202ea2c8244e849a11073ee5d918c5" is_pre_election: true
I20260430 02:02:04.584062 26100 leader_election.cc:304] T 63c4457448ec4b1b8a0741a7560cddfe P e481cdb11e4040f2b78a224a2bed5eaa [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: e481cdb11e4040f2b78a224a2bed5eaa; no voters: a4ec6dffeb11435b8655672771cd29c4, f5202ea2c8244e849a11073ee5d918c5
I20260430 02:02:04.584525 26240 raft_consensus.cc:2749] T 63c4457448ec4b1b8a0741a7560cddfe P e481cdb11e4040f2b78a224a2bed5eaa [term 1 FOLLOWER]: Leader pre-election lost for term 2. Reason: could not achieve majority
I20260430 02:02:04.591603 26219 raft_consensus.cc:493] T 869a29991dbe4537bf1082d1c9ee2ecd P e481cdb11e4040f2b78a224a2bed5eaa [term 1 FOLLOWER]: Starting pre-election (detected failure of leader a4ec6dffeb11435b8655672771cd29c4)
I20260430 02:02:04.591740 26219 raft_consensus.cc:515] T 869a29991dbe4537bf1082d1c9ee2ecd P e481cdb11e4040f2b78a224a2bed5eaa [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:02:04.592247 26219 leader_election.cc:290] T 869a29991dbe4537bf1082d1c9ee2ecd P e481cdb11e4040f2b78a224a2bed5eaa [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers f5202ea2c8244e849a11073ee5d918c5 (127.24.153.66:43745), a4ec6dffeb11435b8655672771cd29c4 (127.24.153.67:34223)
I20260430 02:02:04.597059 25631 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "869a29991dbe4537bf1082d1c9ee2ecd" candidate_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" candidate_term: 2 candidate_status { last_received { term: 1 index: 703 } } ignore_live_leader: false dest_uuid: "a4ec6dffeb11435b8655672771cd29c4" is_pre_election: true
I20260430 02:02:04.597543 25498 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "869a29991dbe4537bf1082d1c9ee2ecd" candidate_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" candidate_term: 2 candidate_status { last_received { term: 1 index: 703 } } ignore_live_leader: false dest_uuid: "f5202ea2c8244e849a11073ee5d918c5" is_pre_election: true
I20260430 02:02:04.598052 26100 leader_election.cc:304] T 869a29991dbe4537bf1082d1c9ee2ecd P e481cdb11e4040f2b78a224a2bed5eaa [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: e481cdb11e4040f2b78a224a2bed5eaa; no voters: a4ec6dffeb11435b8655672771cd29c4, f5202ea2c8244e849a11073ee5d918c5
I20260430 02:02:04.598294 26219 raft_consensus.cc:2749] T 869a29991dbe4537bf1082d1c9ee2ecd P e481cdb11e4040f2b78a224a2bed5eaa [term 1 FOLLOWER]: Leader pre-election lost for term 2. Reason: could not achieve majority
W20260430 02:02:04.860600 25568 consensus_peers.cc:597] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: BOOTSTRAPPING. This is attempt 76: this message will repeat every 5th retry.
I20260430 02:02:04.924080 26103 tablet_bootstrap.cc:492] T fbfe4b6b54594e36a76de6e54d2adb8c P e481cdb11e4040f2b78a224a2bed5eaa: Bootstrap replayed 1/1 log segments. Stats: ops{read=68 overwritten=0 applied=68 ignored=0} inserts{seen=572 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20260430 02:02:04.924985 26103 tablet_bootstrap.cc:492] T fbfe4b6b54594e36a76de6e54d2adb8c P e481cdb11e4040f2b78a224a2bed5eaa: Bootstrap complete.
I20260430 02:02:04.926504 26103 ts_tablet_manager.cc:1403] T fbfe4b6b54594e36a76de6e54d2adb8c P e481cdb11e4040f2b78a224a2bed5eaa: Time spent bootstrapping tablet: real 0.464s user 0.065s sys 0.019s
I20260430 02:02:04.927085 26103 raft_consensus.cc:359] T fbfe4b6b54594e36a76de6e54d2adb8c P e481cdb11e4040f2b78a224a2bed5eaa [term 1 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:02:04.927282 26103 raft_consensus.cc:740] T fbfe4b6b54594e36a76de6e54d2adb8c P e481cdb11e4040f2b78a224a2bed5eaa [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: e481cdb11e4040f2b78a224a2bed5eaa, State: Initialized, Role: FOLLOWER
I20260430 02:02:04.927495 26103 consensus_queue.cc:260] T fbfe4b6b54594e36a76de6e54d2adb8c P e481cdb11e4040f2b78a224a2bed5eaa [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 68, Last appended: 1.68, Last appended by leader: 68, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:02:04.927938 26103 ts_tablet_manager.cc:1434] T fbfe4b6b54594e36a76de6e54d2adb8c P e481cdb11e4040f2b78a224a2bed5eaa: Time spent starting tablet: real 0.001s user 0.000s sys 0.001s
I20260430 02:02:05.037884 26161 raft_consensus.cc:3060] T fbfe4b6b54594e36a76de6e54d2adb8c P e481cdb11e4040f2b78a224a2bed5eaa [term 1 FOLLOWER]: Advancing to term 3
I20260430 02:02:05.346621 26219 raft_consensus.cc:493] T f1dbbbee06674a93a3dd31c45c90d59d P e481cdb11e4040f2b78a224a2bed5eaa [term 2 FOLLOWER]: Starting pre-election (detected failure of leader f5202ea2c8244e849a11073ee5d918c5)
I20260430 02:02:05.346895 26219 raft_consensus.cc:515] T f1dbbbee06674a93a3dd31c45c90d59d P e481cdb11e4040f2b78a224a2bed5eaa [term 2 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:02:05.347538 26219 leader_election.cc:290] T f1dbbbee06674a93a3dd31c45c90d59d P e481cdb11e4040f2b78a224a2bed5eaa [CANDIDATE]: Term 3 pre-election: Requested pre-vote from peers f5202ea2c8244e849a11073ee5d918c5 (127.24.153.66:43745), a4ec6dffeb11435b8655672771cd29c4 (127.24.153.67:34223)
I20260430 02:02:05.348294 25632 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "f1dbbbee06674a93a3dd31c45c90d59d" candidate_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" candidate_term: 3 candidate_status { last_received { term: 2 index: 721 } } ignore_live_leader: false dest_uuid: "a4ec6dffeb11435b8655672771cd29c4" is_pre_election: true
I20260430 02:02:05.348297 25498 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "f1dbbbee06674a93a3dd31c45c90d59d" candidate_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" candidate_term: 3 candidate_status { last_received { term: 2 index: 721 } } ignore_live_leader: false dest_uuid: "f5202ea2c8244e849a11073ee5d918c5" is_pre_election: true
I20260430 02:02:05.348796 26099 leader_election.cc:304] T f1dbbbee06674a93a3dd31c45c90d59d P e481cdb11e4040f2b78a224a2bed5eaa [CANDIDATE]: Term 3 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: e481cdb11e4040f2b78a224a2bed5eaa; no voters: a4ec6dffeb11435b8655672771cd29c4, f5202ea2c8244e849a11073ee5d918c5
I20260430 02:02:05.349042 26219 raft_consensus.cc:2749] T f1dbbbee06674a93a3dd31c45c90d59d P e481cdb11e4040f2b78a224a2bed5eaa [term 2 FOLLOWER]: Leader pre-election lost for term 3. Reason: could not achieve majority
I20260430 02:02:05.364125 26240 raft_consensus.cc:493] T 2c080a5b649f453c903e1dce5ab6a113 P e481cdb11e4040f2b78a224a2bed5eaa [term 1 FOLLOWER]: Starting pre-election (detected failure of leader a4ec6dffeb11435b8655672771cd29c4)
I20260430 02:02:05.364383 26240 raft_consensus.cc:515] T 2c080a5b649f453c903e1dce5ab6a113 P e481cdb11e4040f2b78a224a2bed5eaa [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:02:05.365235 26240 leader_election.cc:290] T 2c080a5b649f453c903e1dce5ab6a113 P e481cdb11e4040f2b78a224a2bed5eaa [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers f5202ea2c8244e849a11073ee5d918c5 (127.24.153.66:43745), a4ec6dffeb11435b8655672771cd29c4 (127.24.153.67:34223)
I20260430 02:02:05.365974 25631 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "2c080a5b649f453c903e1dce5ab6a113" candidate_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" candidate_term: 2 candidate_status { last_received { term: 1 index: 730 } } ignore_live_leader: false dest_uuid: "a4ec6dffeb11435b8655672771cd29c4" is_pre_election: true
I20260430 02:02:05.370306 25498 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "2c080a5b649f453c903e1dce5ab6a113" candidate_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" candidate_term: 2 candidate_status { last_received { term: 1 index: 730 } } ignore_live_leader: false dest_uuid: "f5202ea2c8244e849a11073ee5d918c5" is_pre_election: true
I20260430 02:02:05.371097 26160 rpcz_store.cc:275] Call kudu.consensus.ConsensusService.UpdateConsensus from 127.24.153.66:56009 (request call id 19) took 1110 ms. Trace:
I20260430 02:02:05.381289 26100 leader_election.cc:304] T 2c080a5b649f453c903e1dce5ab6a113 P e481cdb11e4040f2b78a224a2bed5eaa [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: e481cdb11e4040f2b78a224a2bed5eaa; no voters: a4ec6dffeb11435b8655672771cd29c4, f5202ea2c8244e849a11073ee5d918c5
I20260430 02:02:05.381811 26240 raft_consensus.cc:2749] T 2c080a5b649f453c903e1dce5ab6a113 P e481cdb11e4040f2b78a224a2bed5eaa [term 1 FOLLOWER]: Leader pre-election lost for term 2. Reason: could not achieve majority
I20260430 02:02:05.658926 26240 raft_consensus.cc:493] T f1dbbbee06674a93a3dd31c45c90d59d P e481cdb11e4040f2b78a224a2bed5eaa [term 2 FOLLOWER]: Starting pre-election (detected failure of leader f5202ea2c8244e849a11073ee5d918c5)
I20260430 02:02:05.659055 26240 raft_consensus.cc:515] T f1dbbbee06674a93a3dd31c45c90d59d P e481cdb11e4040f2b78a224a2bed5eaa [term 2 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:02:05.659435 26240 leader_election.cc:290] T f1dbbbee06674a93a3dd31c45c90d59d P e481cdb11e4040f2b78a224a2bed5eaa [CANDIDATE]: Term 3 pre-election: Requested pre-vote from peers f5202ea2c8244e849a11073ee5d918c5 (127.24.153.66:43745), a4ec6dffeb11435b8655672771cd29c4 (127.24.153.67:34223)
I20260430 02:02:05.659965 25631 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "f1dbbbee06674a93a3dd31c45c90d59d" candidate_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" candidate_term: 3 candidate_status { last_received { term: 2 index: 721 } } ignore_live_leader: false dest_uuid: "a4ec6dffeb11435b8655672771cd29c4" is_pre_election: true
I20260430 02:02:05.660491 25498 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "f1dbbbee06674a93a3dd31c45c90d59d" candidate_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" candidate_term: 3 candidate_status { last_received { term: 2 index: 721 } } ignore_live_leader: false dest_uuid: "f5202ea2c8244e849a11073ee5d918c5" is_pre_election: true
I20260430 02:02:05.662300 26100 leader_election.cc:304] T f1dbbbee06674a93a3dd31c45c90d59d P e481cdb11e4040f2b78a224a2bed5eaa [CANDIDATE]: Term 3 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: e481cdb11e4040f2b78a224a2bed5eaa; no voters: a4ec6dffeb11435b8655672771cd29c4, f5202ea2c8244e849a11073ee5d918c5
I20260430 02:02:05.670450 26240 raft_consensus.cc:2749] T f1dbbbee06674a93a3dd31c45c90d59d P e481cdb11e4040f2b78a224a2bed5eaa [term 2 FOLLOWER]: Leader pre-election lost for term 3. Reason: could not achieve majority
I20260430 02:02:05.371222 26160 rpcz_store.cc:276] 0430 02:02:04.260418 (+ 0us) service_pool.cc:168] Inserting onto call queue
0430 02:02:04.260722 (+ 304us) service_pool.cc:225] Handling call
0430 02:02:04.283632 (+ 22910us) raft_consensus.cc:1471] Updating replica for 653 ops
0430 02:02:04.339277 (+ 55645us) raft_consensus.cc:1534] Early marking committed up to index 0
0430 02:02:04.339280 (+ 3us) raft_consensus.cc:1539] Triggering prepare for 653 ops
0430 02:02:04.339396 (+ 116us) write_op.cc:270] Start()
0430 02:02:04.339412 (+ 16us) write_op.cc:276] Timestamp: P: 1777514516243283 usec, L: 0
0430 02:02:04.339643 (+ 231us) write_op.cc:270] Start()
0430 02:02:04.339658 (+ 15us) write_op.cc:276] Timestamp: P: 1777514516246779 usec, L: 0
0430 02:02:04.352778 (+ 13120us) write_op.cc:270] Start()
0430 02:02:04.352794 (+ 16us) write_op.cc:276] Timestamp: P: 1777514516271059 usec, L: 0
0430 02:02:04.352875 (+ 81us) write_op.cc:270] Start()
0430 02:02:04.352887 (+ 12us) write_op.cc:276] Timestamp: P: 1777514516444506 usec, L: 0
0430 02:02:04.352963 (+ 76us) write_op.cc:270] Start()
0430 02:02:04.352974 (+ 11us) write_op.cc:276] Timestamp: P: 1777514516585631 usec, L: 0
0430 02:02:04.353043 (+ 69us) write_op.cc:270] Start()
0430 02:02:04.353055 (+ 12us) write_op.cc:276] Timestamp: P: 1777514516593941 usec, L: 0
0430 02:02:04.354460 (+ 1405us) write_op.cc:270] Start()
0430 02:02:04.354501 (+ 41us) write_op.cc:276] Timestamp: P: 1777514516608446 usec, L: 0
0430 02:02:04.354605 (+ 104us) write_op.cc:270] Start()
0430 02:02:04.354618 (+ 13us) write_op.cc:276] Timestamp: P: 1777514516620314 usec, L: 0
0430 02:02:04.354711 (+ 93us) write_op.cc:270] Start()
0430 02:02:04.354725 (+ 14us) write_op.cc:276] Timestamp: P: 1777514516631611 usec, L: 0
0430 02:02:04.354819 (+ 94us) write_op.cc:270] Start()
0430 02:02:04.354833 (+ 14us) write_op.cc:276] Timestamp: P: 1777514516645700 usec, L: 0
0430 02:02:04.354923 (+ 90us) write_op.cc:270] Start()
0430 02:02:04.354936 (+ 13us) write_op.cc:276] Timestamp: P: 1777514516652016 usec, L: 0
0430 02:02:04.355027 (+ 91us) write_op.cc:270] Start()
0430 02:02:04.355041 (+ 14us) write_op.cc:276] Timestamp: P: 1777514516658902 usec, L: 0
0430 02:02:04.355122 (+ 81us) write_op.cc:270] Start()
0430 02:02:04.355135 (+ 13us) write_op.cc:276] Timestamp: P: 1777514516670040 usec, L: 0
0430 02:02:04.355206 (+ 71us) write_op.cc:270] Start()
0430 02:02:04.355218 (+ 12us) write_op.cc:276] Timestamp: P: 1777514516684757 usec, L: 0
0430 02:02:04.358066 (+ 2848us) write_op.cc:270] Start()
0430 02:02:04.358088 (+ 22us) write_op.cc:276] Timestamp: P: 1777514516685793 usec, L: 0
0430 02:02:04.358201 (+ 113us) write_op.cc:270] Start()
0430 02:02:04.358215 (+ 14us) write_op.cc:276] Timestamp: P: 1777514516697419 usec, L: 0
0430 02:02:04.358309 (+ 94us) write_op.cc:270] Start()
0430 02:02:04.358328 (+ 19us) write_op.cc:276] Timestamp: P: 1777514516716094 usec, L: 0
0430 02:02:04.358586 (+ 258us) write_op.cc:270] Start()
0430 02:02:04.358608 (+ 22us) write_op.cc:276] Timestamp: P: 1777514516722250 usec, L: 0
0430 02:02:04.358748 (+ 140us) write_op.cc:270] Start()
0430 02:02:04.358764 (+ 16us) write_op.cc:276] Timestamp: P: 1777514516740157 usec, L: 0
0430 02:02:04.358873 (+ 109us) write_op.cc:270] Start()
0430 02:02:04.358887 (+ 14us) write_op.cc:276] Timestamp: P: 1777514516744340 usec, L: 0
0430 02:02:04.359008 (+ 121us) write_op.cc:270] Start()
0430 02:02:04.359021 (+ 13us) write_op.cc:276] Timestamp: P: 1777514516756938 usec, L: 0
0430 02:02:04.359106 (+ 85us) write_op.cc:270] Start()
0430 02:02:04.359141 (+ 35us) write_op.cc:276] Timestamp: P: 1777514516765368 usec, L: 0
0430 02:02:04.359244 (+ 103us) write_op.cc:270] Start()
0430 02:02:04.359258 (+ 14us) write_op.cc:276] Timestamp: P: 1777514516769310 usec, L: 0
0430 02:02:04.359344 (+ 86us) write_op.cc:270] Start()
0430 02:02:04.359356 (+ 12us) write_op.cc:276] Timestamp: P: 1777514516791710 usec, L: 0
0430 02:02:04.359464 (+ 108us) write_op.cc:270] Start()
0430 02:02:04.359478 (+ 14us) write_op.cc:276] Timestamp: P: 1777514516802641 usec, L: 0
0430 02:02:04.359563 (+ 85us) write_op.cc:270] Start()
0430 02:02:04.359576 (+ 13us) write_op.cc:276] Timestamp: P: 1777514516806388 usec, L: 0
0430 02:02:04.359671 (+ 95us) write_op.cc:270] Start()
0430 02:02:04.359684 (+ 13us) write_op.cc:276] Timestamp: P: 1777514516821745 usec, L: 0
0430 02:02:04.359771 (+ 87us) write_op.cc:270] Start()
0430 02:02:04.359783 (+ 12us) write_op.cc:276] Timestamp: P: 1777514516822867 usec, L: 0
0430 02:02:04.359879 (+ 96us) write_op.cc:270] Start()
0430 02:02:04.359892 (+ 13us) write_op.cc:276] Timestamp: P: 1777514516840603 usec, L: 0
0430 02:02:04.359979 (+ 87us) write_op.cc:270] Start()
0430 02:02:04.359992 (+ 13us) write_op.cc:276] Timestamp: P: 1777514516846268 usec, L: 0
0430 02:02:04.360121 (+ 129us) write_op.cc:270] Start()
0430 02:02:04.360138 (+ 17us) write_op.cc:276] Timestamp: P: 1777514516860123 usec, L: 0
0430 02:02:04.360226 (+ 88us) write_op.cc:270] Start()
0430 02:02:04.360240 (+ 14us) write_op.cc:276] Timestamp: P: 1777514516865527 usec, L: 0
0430 02:02:04.360350 (+ 110us) write_op.cc:270] Start()
0430 02:02:04.360365 (+ 15us) write_op.cc:276] Timestamp: P: 1777514516866488 usec, L: 0
0430 02:02:04.360468 (+ 103us) write_op.cc:270] Start()
0430 02:02:04.360485 (+ 17us) write_op.cc:276] Timestamp: P: 1777514516878344 usec, L: 0
0430 02:02:04.360606 (+ 121us) write_op.cc:270] Start()
0430 02:02:04.360621 (+ 15us) write_op.cc:276] Timestamp: P: 1777514516898076 usec, L: 0
0430 02:02:04.360715 (+ 94us) write_op.cc:270] Start()
0430 02:02:04.360727 (+ 12us) write_op.cc:276] Timestamp: P: 1777514516906712 usec, L: 0
0430 02:02:04.360824 (+ 97us) write_op.cc:270] Start()
0430 02:02:04.360857 (+ 33us) write_op.cc:276] Timestamp: P: 1777514516911312 usec, L: 0
0430 02:02:04.360951 (+ 94us) write_op.cc:270] Start()
0430 02:02:04.360964 (+ 13us) write_op.cc:276] Timestamp: P: 1777514516914186 usec, L: 0
0430 02:02:04.361067 (+ 103us) write_op.cc:270] Start()
0430 02:02:04.361084 (+ 17us) write_op.cc:276] Timestamp: P: 1777514516933959 usec, L: 0
0430 02:02:04.361193 (+ 109us) write_op.cc:270] Start()
0430 02:02:04.361209 (+ 16us) write_op.cc:276] Timestamp: P: 1777514516937005 usec, L: 0
0430 02:02:04.361313 (+ 104us) write_op.cc:270] Start()
0430 02:02:04.361329 (+ 16us) write_op.cc:276] Timestamp: P: 1777514516955930 usec, L: 0
0430 02:02:04.366044 (+ 4715us) write_op.cc:270] Start()
0430 02:02:04.366061 (+ 17us) write_op.cc:276] Timestamp: P: 1777514516958885 usec, L: 0
0430 02:02:04.386807 (+ 20746us) write_op.cc:270] Start()
0430 02:02:04.386835 (+ 28us) write_op.cc:276] Timestamp: P: 1777514516980010 usec, L: 0
0430 02:02:04.386979 (+ 144us) write_op.cc:270] Start()
0430 02:02:04.387001 (+ 22us) write_op.cc:276] Timestamp: P: 1777514516980496 usec, L: 0
0430 02:02:04.387125 (+ 124us) write_op.cc:270] Start()
0430 02:02:04.387146 (+ 21us) write_op.cc:276] Timestamp: P: 1777514516981261 usec, L: 0
0430 02:02:04.387262 (+ 116us) write_op.cc:270] Start()
0430 02:02:04.387277 (+ 15us) write_op.cc:276] Timestamp: P: 1777514516992650 usec, L: 0
0430 02:02:04.387380 (+ 103us) write_op.cc:270] Start()
0430 02:02:04.387394 (+ 14us) write_op.cc:276] Timestamp: P: 1777514517004910 usec, L: 0
0430 02:02:04.387491 (+ 97us) write_op.cc:270] Start()
0430 02:02:04.387504 (+ 13us) write_op.cc:276] Timestamp: P: 1777514517005760 usec, L: 0
0430 02:02:04.387596 (+ 92us) write_op.cc:270] Start()
0430 02:02:04.387609 (+ 13us) write_op.cc:276] Timestamp: P: 1777514517016345 usec, L: 0
0430 02:02:04.387685 (+ 76us) write_op.cc:270] Start()
0430 02:02:04.387696 (+ 11us) write_op.cc:276] Timestamp: P: 1777514517030600 usec, L: 0
0430 02:02:04.392195 (+ 4499us) write_op.cc:270] Start()
0430 02:02:04.392214 (+ 19us) write_op.cc:276] Timestamp: P: 1777514517042498 usec, L: 0
0430 02:02:04.394552 (+ 2338us) write_op.cc:270] Start()
0430 02:02:04.394583 (+ 31us) write_op.cc:276] Timestamp: P: 1777514517044897 usec, L: 0
0430 02:02:04.394884 (+ 301us) write_op.cc:270] Start()
0430 02:02:04.394905 (+ 21us) write_op.cc:276] Timestamp: P: 1777514517059839 usec, L: 0
0430 02:02:04.394995 (+ 90us) write_op.cc:270] Start()
0430 02:02:04.395007 (+ 12us) write_op.cc:276] Timestamp: P: 1777514517066497 usec, L: 0
0430 02:02:04.395090 (+ 83us) write_op.cc:270] Start()
0430 02:02:04.395102 (+ 12us) write_op.cc:276] Timestamp: P: 1777514517081370 usec, L: 0
0430 02:02:04.395181 (+ 79us) write_op.cc:270] Start()
0430 02:02:04.395194 (+ 13us) write_op.cc:276] Timestamp: P: 1777514517096160 usec, L: 0
0430 02:02:04.395270 (+ 76us) write_op.cc:270] Start()
0430 02:02:04.395297 (+ 27us) write_op.cc:276] Timestamp: P: 1777514517100674 usec, L: 0
0430 02:02:04.395449 (+ 152us) write_op.cc:270] Start()
0430 02:02:04.395468 (+ 19us) write_op.cc:276] Timestamp: P: 1777514517120339 usec, L: 0
0430 02:02:04.395584 (+ 116us) write_op.cc:270] Start()
0430 02:02:04.395598 (+ 14us) write_op.cc:276] Timestamp: P: 1777514517120813 usec, L: 0
0430 02:02:04.395725 (+ 127us) write_op.cc:270] Start()
0430 02:02:04.395745 (+ 20us) write_op.cc:276] Timestamp: P: 1777514517128784 usec, L: 0
0430 02:02:04.395849 (+ 104us) write_op.cc:270] Start()
0430 02:02:04.395867 (+ 18us) write_op.cc:276] Timestamp: P: 1777514517140874 usec, L: 0
0430 02:02:04.395963 (+ 96us) write_op.cc:270] Start()
0430 02:02:04.395977 (+ 14us) write_op.cc:276] Timestamp: P: 1777514517155131 usec, L: 0
0430 02:02:04.396059 (+ 82us) write_op.cc:270] Start()
0430 02:02:04.396071 (+ 12us) write_op.cc:276] Timestamp: P: 1777514517159235 usec, L: 0
0430 02:02:04.396183 (+ 112us) write_op.cc:270] Start()
0430 02:02:04.396197 (+ 14us) write_op.cc:276] Timestamp: P: 1777514517161452 usec, L: 0
0430 02:02:04.396310 (+ 113us) write_op.cc:270] Start()
0430 02:02:04.396323 (+ 13us) write_op.cc:276] Timestamp: P: 1777514517172113 usec, L: 0
0430 02:02:04.398997 (+ 2674us) write_op.cc:270] Start()
0430 02:02:04.399017 (+ 20us) write_op.cc:276] Timestamp: P: 1777514517183990 usec, L: 0
0430 02:02:04.399576 (+ 559us) write_op.cc:270] Start()
0430 02:02:04.399593 (+ 17us) write_op.cc:276] Timestamp: P: 1777514517195251 usec, L: 0
0430 02:02:04.399690 (+ 97us) write_op.cc:270] Start()
0430 02:02:04.399704 (+ 14us) write_op.cc:276] Timestamp: P: 1777514517195946 usec, L: 0
0430 02:02:04.399789 (+ 85us) write_op.cc:270] Start()
0430 02:02:04.399804 (+ 15us) write_op.cc:276] Timestamp: P: 1777514517206474 usec, L: 0
0430 02:02:04.399889 (+ 85us) write_op.cc:270] Start()
0430 02:02:04.399902 (+ 13us) write_op.cc:276] Timestamp: P: 1777514517218747 usec, L: 0
0430 02:02:04.399998 (+ 96us) write_op.cc:270] Start()
0430 02:02:04.400012 (+ 14us) write_op.cc:276] Timestamp: P: 1777514517229233 usec, L: 0
0430 02:02:04.400091 (+ 79us) write_op.cc:270] Start()
0430 02:02:04.400352 (+ 261us) write_op.cc:276] Timestamp: P: 1777514517242632 usec, L: 0
0430 02:02:04.400848 (+ 496us) write_op.cc:270] Start()
0430 02:02:04.400882 (+ 34us) write_op.cc:276] Timestamp: P: 1777514517263350 usec, L: 0
0430 02:02:04.401170 (+ 288us) write_op.cc:270] Start()
0430 02:02:04.401193 (+ 23us) write_op.cc:276] Timestamp: P: 1777514517268270 usec, L: 0
0430 02:02:04.401307 (+ 114us) write_op.cc:270] Start()
0430 02:02:04.401325 (+ 18us) write_op.cc:276] Timestamp: P: 1777514517275061 usec, L: 0
0430 02:02:04.401830 (+ 505us) write_op.cc:270] Start()
0430 02:02:04.401848 (+ 18us) write_op.cc:276] Timestamp: P: 1777514517277660 usec, L: 0
0430 02:02:04.401969 (+ 121us) write_op.cc:270] Start()
0430 02:02:04.401983 (+ 14us) write_op.cc:276] Timestamp: P: 1777514517298175 usec, L: 0
0430 02:02:04.402066 (+ 83us) write_op.cc:270] Start()
0430 02:02:04.402078 (+ 12us) write_op.cc:276] Timestamp: P: 1777514517308749 usec, L: 0
0430 02:02:04.402156 (+ 78us) write_op.cc:270] Start()
0430 02:02:04.417339 (+ 15183us) write_op.cc:276] Timestamp: P: 1777514517313711 usec, L: 0
0430 02:02:04.417844 (+ 505us) write_op.cc:270] Start()
0430 02:02:04.417878 (+ 34us) write_op.cc:276] Timestamp: P: 1777514517320262 usec, L: 0
0430 02:02:04.418060 (+ 182us) write_op.cc:270] Start()
0430 02:02:04.418075 (+ 15us) write_op.cc:276] Timestamp: P: 1777514517326621 usec, L: 0
0430 02:02:04.418161 (+ 86us) write_op.cc:270] Start()
0430 02:02:04.418174 (+ 13us) write_op.cc:276] Timestamp: P: 1777514517358087 usec, L: 0
0430 02:02:04.418263 (+ 89us) write_op.cc:270] Start()
0430 02:02:04.418275 (+ 12us) write_op.cc:276] Timestamp: P: 1777514517364511 usec, L: 0
0430 02:02:04.418678 (+ 403us) write_op.cc:270] Start()
0430 02:02:04.418708 (+ 30us) write_op.cc:276] Timestamp: P: 1777514517375532 usec, L: 0
0430 02:02:04.419098 (+ 390us) write_op.cc:270] Start()
0430 02:02:04.419121 (+ 23us) write_op.cc:276] Timestamp: P: 1777514517376440 usec, L: 0
0430 02:02:04.419260 (+ 139us) write_op.cc:270] Start()
0430 02:02:04.419274 (+ 14us) write_op.cc:276] Timestamp: P: 1777514517394157 usec, L: 0
0430 02:02:04.423231 (+ 3957us) write_op.cc:270] Start()
0430 02:02:04.423250 (+ 19us) write_op.cc:276] Timestamp: P: 1777514517398734 usec, L: 0
0430 02:02:04.423365 (+ 115us) write_op.cc:270] Start()
0430 02:02:04.423383 (+ 18us) write_op.cc:276] Timestamp: P: 1777514517413690 usec, L: 0
0430 02:02:04.423765 (+ 382us) write_op.cc:270] Start()
0430 02:02:04.423788 (+ 23us) write_op.cc:276] Timestamp: P: 1777514517418699 usec, L: 0
0430 02:02:04.423912 (+ 124us) write_op.cc:270] Start()
0430 02:02:04.423927 (+ 15us) write_op.cc:276] Timestamp: P: 1777514517424347 usec, L: 0
0430 02:02:04.424108 (+ 181us) write_op.cc:270] Start()
0430 02:02:04.424124 (+ 16us) write_op.cc:276] Timestamp: P: 1777514517440028 usec, L: 0
0430 02:02:04.424213 (+ 89us) write_op.cc:270] Start()
0430 02:02:04.424226 (+ 13us) write_op.cc:276] Timestamp: P: 1777514517464690 usec, L: 0
0430 02:02:04.424320 (+ 94us) write_op.cc:270] Start()
0430 02:02:04.424338 (+ 18us) write_op.cc:276] Timestamp: P: 1777514517470394 usec, L: 0
0430 02:02:04.424435 (+ 97us) write_op.cc:270] Start()
0430 02:02:04.424467 (+ 32us) write_op.cc:276] Timestamp: P: 1777514517480498 usec, L: 0
0430 02:02:04.424588 (+ 121us) write_op.cc:270] Start()
0430 02:02:04.424606 (+ 18us) write_op.cc:276] Timestamp: P: 1777514517501073 usec, L: 0
0430 02:02:04.424714 (+ 108us) write_op.cc:270] Start()
0430 02:02:04.424729 (+ 15us) write_op.cc:276] Timestamp: P: 1777514517520845 usec, L: 0
0430 02:02:04.424856 (+ 127us) write_op.cc:270] Start()
0430 02:02:04.424874 (+ 18us) write_op.cc:276] Timestamp: P: 1777514517526060 usec, L: 0
0430 02:02:04.424997 (+ 123us) write_op.cc:270] Start()
0430 02:02:04.425013 (+ 16us) write_op.cc:276] Timestamp: P: 1777514517545380 usec, L: 0
0430 02:02:04.425117 (+ 104us) write_op.cc:270] Start()
0430 02:02:04.425130 (+ 13us) write_op.cc:276] Timestamp: P: 1777514517551343 usec, L: 0
0430 02:02:04.425211 (+ 81us) write_op.cc:270] Start()
0430 02:02:04.425223 (+ 12us) write_op.cc:276] Timestamp: P: 1777514517564476 usec, L: 0
0430 02:02:04.425302 (+ 79us) write_op.cc:270] Start()
0430 02:02:04.425317 (+ 15us) write_op.cc:276] Timestamp: P: 1777514517589690 usec, L: 0
0430 02:02:04.425402 (+ 85us) write_op.cc:270] Start()
0430 02:02:04.425416 (+ 14us) write_op.cc:276] Timestamp: P: 1777514517594327 usec, L: 0
0430 02:02:04.425814 (+ 398us) write_op.cc:270] Start()
0430 02:02:04.425835 (+ 21us) write_op.cc:276] Timestamp: P: 1777514517597770 usec, L: 0
0430 02:02:04.425978 (+ 143us) write_op.cc:270] Start()
0430 02:02:04.425995 (+ 17us) write_op.cc:276] Timestamp: P: 1777514517599913 usec, L: 0
0430 02:02:04.426100 (+ 105us) write_op.cc:270] Start()
0430 02:02:04.426116 (+ 16us) write_op.cc:276] Timestamp: P: 1777514517617219 usec, L: 0
0430 02:02:04.426224 (+ 108us) write_op.cc:270] Start()
0430 02:02:04.426239 (+ 15us) write_op.cc:276] Timestamp: P: 1777514517636788 usec, L: 0
0430 02:02:04.436366 (+ 10127us) write_op.cc:270] Start()
0430 02:02:04.436695 (+ 329us) write_op.cc:276] Timestamp: P: 1777514517640088 usec, L: 0
0430 02:02:04.436874 (+ 179us) write_op.cc:270] Start()
0430 02:02:04.436893 (+ 19us) write_op.cc:276] Timestamp: P: 1777514517647018 usec, L: 0
0430 02:02:04.437005 (+ 112us) write_op.cc:270] Start()
0430 02:02:04.437021 (+ 16us) write_op.cc:276] Timestamp: P: 1777514517651592 usec, L: 0
0430 02:02:04.437113 (+ 92us) write_op.cc:270] Start()
0430 02:02:04.437126 (+ 13us) write_op.cc:276] Timestamp: P: 1777514517671410 usec, L: 0
0430 02:02:04.437206 (+ 80us) write_op.cc:270] Start()
0430 02:02:04.437220 (+ 14us) write_op.cc:276] Timestamp: P: 1777514517672787 usec, L: 0
0430 02:02:04.452256 (+ 15036us) write_op.cc:270] Start()
0430 02:02:04.452277 (+ 21us) write_op.cc:276] Timestamp: P: 1777514517687681 usec, L: 0
0430 02:02:04.452400 (+ 123us) write_op.cc:270] Start()
0430 02:02:04.452421 (+ 21us) write_op.cc:276] Timestamp: P: 1777514517690445 usec, L: 0
0430 02:02:04.452520 (+ 99us) write_op.cc:270] Start()
0430 02:02:04.452533 (+ 13us) write_op.cc:276] Timestamp: P: 1777514517721849 usec, L: 0
0430 02:02:04.452630 (+ 97us) write_op.cc:270] Start()
0430 02:02:04.452645 (+ 15us) write_op.cc:276] Timestamp: P: 1777514517723585 usec, L: 0
0430 02:02:04.452733 (+ 88us) write_op.cc:270] Start()
0430 02:02:04.452747 (+ 14us) write_op.cc:276] Timestamp: P: 1777514517724066 usec, L: 0
0430 02:02:04.452837 (+ 90us) write_op.cc:270] Start()
0430 02:02:04.452853 (+ 16us) write_op.cc:276] Timestamp: P: 1777514517731180 usec, L: 0
0430 02:02:04.452946 (+ 93us) write_op.cc:270] Start()
0430 02:02:04.452959 (+ 13us) write_op.cc:276] Timestamp: P: 1777514517763686 usec, L: 0
0430 02:02:04.453047 (+ 88us) write_op.cc:270] Start()
0430 02:02:04.453060 (+ 13us) write_op.cc:276] Timestamp: P: 1777514517774478 usec, L: 0
0430 02:02:04.453138 (+ 78us) write_op.cc:270] Start()
0430 02:02:04.453150 (+ 12us) write_op.cc:276] Timestamp: P: 1777514517776674 usec, L: 0
0430 02:02:04.453260 (+ 110us) write_op.cc:270] Start()
0430 02:02:04.453273 (+ 13us) write_op.cc:276] Timestamp: P: 1777514517784535 usec, L: 0
0430 02:02:04.455397 (+ 2124us) write_op.cc:270] Start()
0430 02:02:04.455417 (+ 20us) write_op.cc:276] Timestamp: P: 1777514517800152 usec, L: 0
0430 02:02:04.455526 (+ 109us) write_op.cc:270] Start()
0430 02:02:04.455539 (+ 13us) write_op.cc:276] Timestamp: P: 1777514517809542 usec, L: 0
0430 02:02:04.455621 (+ 82us) write_op.cc:270] Start()
0430 02:02:04.455633 (+ 12us) write_op.cc:276] Timestamp: P: 1777514517811139 usec, L: 0
0430 02:02:04.455717 (+ 84us) write_op.cc:270] Start()
0430 02:02:04.455730 (+ 13us) write_op.cc:276] Timestamp: P: 1777514517833130 usec, L: 0
0430 02:02:04.455809 (+ 79us) write_op.cc:270] Start()
0430 02:02:04.455822 (+ 13us) write_op.cc:276] Timestamp: P: 1777514517843554 usec, L: 0
0430 02:02:04.455909 (+ 87us) write_op.cc:270] Start()
0430 02:02:04.455927 (+ 18us) write_op.cc:276] Timestamp: P: 1777514517856922 usec, L: 0
0430 02:02:04.456016 (+ 89us) write_op.cc:270] Start()
0430 02:02:04.456031 (+ 15us) write_op.cc:276] Timestamp: P: 1777514517857810 usec, L: 0
0430 02:02:04.456158 (+ 127us) write_op.cc:270] Start()
0430 02:02:04.456172 (+ 14us) write_op.cc:276] Timestamp: P: 1777514517873321 usec, L: 0
0430 02:02:04.456262 (+ 90us) write_op.cc:270] Start()
0430 02:02:04.456275 (+ 13us) write_op.cc:276] Timestamp: P: 1777514517874479 usec, L: 0
0430 02:02:04.456363 (+ 88us) write_op.cc:270] Start()
0430 02:02:04.456406 (+ 43us) write_op.cc:276] Timestamp: P: 1777514517881292 usec, L: 0
0430 02:02:04.456528 (+ 122us) write_op.cc:270] Start()
0430 02:02:04.456546 (+ 18us) write_op.cc:276] Timestamp: P: 1777514517885856 usec, L: 0
0430 02:02:04.456657 (+ 111us) write_op.cc:270] Start()
0430 02:02:04.456673 (+ 16us) write_op.cc:276] Timestamp: P: 1777514517898032 usec, L: 0
0430 02:02:04.456772 (+ 99us) write_op.cc:270] Start()
0430 02:02:04.456787 (+ 15us) write_op.cc:276] Timestamp: P: 1777514517906659 usec, L: 0
0430 02:02:04.456889 (+ 102us) write_op.cc:270] Start()
0430 02:02:04.456903 (+ 14us) write_op.cc:276] Timestamp: P: 1777514517915186 usec, L: 0
0430 02:02:04.456987 (+ 84us) write_op.cc:270] Start()
0430 02:02:04.457000 (+ 13us) write_op.cc:276] Timestamp: P: 1777514517945454 usec, L: 0
0430 02:02:04.457106 (+ 106us) write_op.cc:270] Start()
0430 02:02:04.457120 (+ 14us) write_op.cc:276] Timestamp: P: 1777514517947210 usec, L: 0
0430 02:02:04.457224 (+ 104us) write_op.cc:270] Start()
0430 02:02:04.457238 (+ 14us) write_op.cc:276] Timestamp: P: 1777514517947766 usec, L: 0
0430 02:02:04.457341 (+ 103us) write_op.cc:270] Start()
0430 02:02:04.457368 (+ 27us) write_op.cc:276] Timestamp: P: 1777514517960136 usec, L: 0
0430 02:02:04.457480 (+ 112us) write_op.cc:270] Start()
0430 02:02:04.457496 (+ 16us) write_op.cc:276] Timestamp: P: 1777514517980804 usec, L: 0
0430 02:02:04.457600 (+ 104us) write_op.cc:270] Start()
0430 02:02:04.457613 (+ 13us) write_op.cc:276] Timestamp: P: 1777514517982246 usec, L: 0
0430 02:02:04.457711 (+ 98us) write_op.cc:270] Start()
0430 02:02:04.457725 (+ 14us) write_op.cc:276] Timestamp: P: 1777514517982815 usec, L: 0
0430 02:02:04.457834 (+ 109us) write_op.cc:270] Start()
0430 02:02:04.457850 (+ 16us) write_op.cc:276] Timestamp: P: 1777514518006218 usec, L: 0
0430 02:02:04.457967 (+ 117us) write_op.cc:270] Start()
0430 02:02:04.457985 (+ 18us) write_op.cc:276] Timestamp: P: 1777514518008060 usec, L: 0
0430 02:02:04.458104 (+ 119us) write_op.cc:270] Start()
0430 02:02:04.458122 (+ 18us) write_op.cc:276] Timestamp: P: 1777514518016998 usec, L: 0
0430 02:02:04.458216 (+ 94us) write_op.cc:270] Start()
0430 02:02:04.458230 (+ 14us) write_op.cc:276] Timestamp: P: 1777514518022551 usec, L: 0
0430 02:02:04.458311 (+ 81us) write_op.cc:270] Start()
0430 02:02:04.458325 (+ 14us) write_op.cc:276] Timestamp: P: 1777514518053228 usec, L: 0
0430 02:02:04.458430 (+ 105us) write_op.cc:270] Start()
0430 02:02:04.458447 (+ 17us) write_op.cc:276] Timestamp: P: 1777514518061825 usec, L: 0
0430 02:02:04.458571 (+ 124us) write_op.cc:270] Start()
0430 02:02:04.458586 (+ 15us) write_op.cc:276] Timestamp: P: 1777514518063325 usec, L: 0
0430 02:02:04.461645 (+ 3059us) write_op.cc:270] Start()
0430 02:02:04.461665 (+ 20us) write_op.cc:276] Timestamp: P: 1777514518063819 usec, L: 0
0430 02:02:04.461784 (+ 119us) write_op.cc:270] Start()
0430 02:02:04.461799 (+ 15us) write_op.cc:276] Timestamp: P: 1777514518094408 usec, L: 0
0430 02:02:04.461884 (+ 85us) write_op.cc:270] Start()
0430 02:02:04.461897 (+ 13us) write_op.cc:276] Timestamp: P: 1777514518098506 usec, L: 0
0430 02:02:04.462032 (+ 135us) write_op.cc:270] Start()
0430 02:02:04.462048 (+ 16us) write_op.cc:276] Timestamp: P: 1777514518106337 usec, L: 0
0430 02:02:04.462141 (+ 93us) write_op.cc:270] Start()
0430 02:02:04.462154 (+ 13us) write_op.cc:276] Timestamp: P: 1777514518107968 usec, L: 0
0430 02:02:04.462237 (+ 83us) write_op.cc:270] Start()
0430 02:02:04.462252 (+ 15us) write_op.cc:276] Timestamp: P: 1777514518128276 usec, L: 0
0430 02:02:04.462332 (+ 80us) write_op.cc:270] Start()
0430 02:02:04.462344 (+ 12us) write_op.cc:276] Timestamp: P: 1777514518139746 usec, L: 0
0430 02:02:04.462454 (+ 110us) write_op.cc:270] Start()
0430 02:02:04.462468 (+ 14us) write_op.cc:276] Timestamp: P: 1777514518148556 usec, L: 0
0430 02:02:04.462553 (+ 85us) write_op.cc:270] Start()
0430 02:02:04.462569 (+ 16us) write_op.cc:276] Timestamp: P: 1777514518149872 usec, L: 0
0430 02:02:04.462657 (+ 88us) write_op.cc:270] Start()
0430 02:02:04.462674 (+ 17us) write_op.cc:276] Timestamp: P: 1777514518151382 usec, L: 0
0430 02:02:04.513871 (+ 51197us) write_op.cc:270] Start()
0430 02:02:04.513885 (+ 14us) write_op.cc:276] Timestamp: P: 1777514518173928 usec, L: 0
0430 02:02:04.514193 (+ 308us) write_op.cc:270] Start()
0430 02:02:04.514208 (+ 15us) write_op.cc:276] Timestamp: P: 1777514518187115 usec, L: 0
0430 02:02:04.514691 (+ 483us) write_op.cc:270] Start()
0430 02:02:04.514715 (+ 24us) write_op.cc:276] Timestamp: P: 1777514518201510 usec, L: 0
0430 02:02:04.516228 (+ 1513us) write_op.cc:270] Start()
0430 02:02:04.516244 (+ 16us) write_op.cc:276] Timestamp: P: 1777514518216159 usec, L: 0
0430 02:02:04.526178 (+ 9934us) write_op.cc:270] Start()
0430 02:02:04.526198 (+ 20us) write_op.cc:276] Timestamp: P: 1777514518218413 usec, L: 0
0430 02:02:04.530862 (+ 4664us) write_op.cc:270] Start()
0430 02:02:04.530889 (+ 27us) write_op.cc:276] Timestamp: P: 1777514518218851 usec, L: 0
0430 02:02:04.531058 (+ 169us) write_op.cc:270] Start()
0430 02:02:04.531075 (+ 17us) write_op.cc:276] Timestamp: P: 1777514518237500 usec, L: 0
0430 02:02:04.531214 (+ 139us) write_op.cc:270] Start()
0430 02:02:04.531275 (+ 61us) write_op.cc:276] Timestamp: P: 1777514518253398 usec, L: 0
0430 02:02:04.531440 (+ 165us) write_op.cc:270] Start()
0430 02:02:04.531455 (+ 15us) write_op.cc:276] Timestamp: P: 1777514518262666 usec, L: 0
0430 02:02:04.531598 (+ 143us) write_op.cc:270] Start()
0430 02:02:04.531612 (+ 14us) write_op.cc:276] Timestamp: P: 1777514518263239 usec, L: 0
0430 02:02:04.532078 (+ 466us) write_op.cc:270] Start()
0430 02:02:04.532095 (+ 17us) write_op.cc:276] Timestamp: P: 1777514518280099 usec, L: 0
0430 02:02:04.532229 (+ 134us) write_op.cc:270] Start()
0430 02:02:04.532243 (+ 14us) write_op.cc:276] Timestamp: P: 1777514518285099 usec, L: 0
0430 02:02:04.532338 (+ 95us) write_op.cc:270] Start()
0430 02:02:04.532350 (+ 12us) write_op.cc:276] Timestamp: P: 1777514518311038 usec, L: 0
0430 02:02:04.532430 (+ 80us) write_op.cc:270] Start()
0430 02:02:04.532445 (+ 15us) write_op.cc:276] Timestamp: P: 1777514518311607 usec, L: 0
0430 02:02:04.532528 (+ 83us) write_op.cc:270] Start()
0430 02:02:04.532541 (+ 13us) write_op.cc:276] Timestamp: P: 1777514518312903 usec, L: 0
0430 02:02:04.532622 (+ 81us) write_op.cc:270] Start()
0430 02:02:04.532652 (+ 30us) write_op.cc:276] Timestamp: P: 1777514518332351 usec, L: 0
0430 02:02:04.532738 (+ 86us) write_op.cc:270] Start()
0430 02:02:04.532751 (+ 13us) write_op.cc:276] Timestamp: P: 1777514518344147 usec, L: 0
0430 02:02:04.535083 (+ 2332us) write_op.cc:270] Start()
0430 02:02:04.535102 (+ 19us) write_op.cc:276] Timestamp: P: 1777514518345657 usec, L: 0
0430 02:02:04.535297 (+ 195us) write_op.cc:270] Start()
0430 02:02:04.535314 (+ 17us) write_op.cc:276] Timestamp: P: 1777514518357561 usec, L: 0
0430 02:02:04.535443 (+ 129us) write_op.cc:270] Start()
0430 02:02:04.535461 (+ 18us) write_op.cc:276] Timestamp: P: 1777514518371222 usec, L: 0
0430 02:02:04.535594 (+ 133us) write_op.cc:270] Start()
0430 02:02:04.535610 (+ 16us) write_op.cc:276] Timestamp: P: 1777514518383048 usec, L: 0
0430 02:02:04.535787 (+ 177us) write_op.cc:270] Start()
0430 02:02:04.535803 (+ 16us) write_op.cc:276] Timestamp: P: 1777514518383666 usec, L: 0
0430 02:02:04.535927 (+ 124us) write_op.cc:270] Start()
0430 02:02:04.535940 (+ 13us) write_op.cc:276] Timestamp: P: 1777514518408193 usec, L: 0
0430 02:02:04.536063 (+ 123us) write_op.cc:270] Start()
0430 02:02:04.536078 (+ 15us) write_op.cc:276] Timestamp: P: 1777514518414503 usec, L: 0
0430 02:02:04.536257 (+ 179us) write_op.cc:270] Start()
0430 02:02:04.536276 (+ 19us) write_op.cc:276] Timestamp: P: 1777514518420278 usec, L: 0
0430 02:02:04.539010 (+ 2734us) write_op.cc:270] Start()
0430 02:02:04.539032 (+ 22us) write_op.cc:276] Timestamp: P: 1777514518437832 usec, L: 0
0430 02:02:04.539162 (+ 130us) write_op.cc:270] Start()
0430 02:02:04.539177 (+ 15us) write_op.cc:276] Timestamp: P: 1777514518446398 usec, L: 0
0430 02:02:04.539335 (+ 158us) write_op.cc:270] Start()
0430 02:02:04.539350 (+ 15us) write_op.cc:276] Timestamp: P: 1777514518468506 usec, L: 0
0430 02:02:04.539495 (+ 145us) write_op.cc:270] Start()
0430 02:02:04.539508 (+ 13us) write_op.cc:276] Timestamp: P: 1777514518469044 usec, L: 0
0430 02:02:04.539629 (+ 121us) write_op.cc:270] Start()
0430 02:02:04.539645 (+ 16us) write_op.cc:276] Timestamp: P: 1777514518471379 usec, L: 0
0430 02:02:04.540053 (+ 408us) write_op.cc:270] Start()
0430 02:02:04.540069 (+ 16us) write_op.cc:276] Timestamp: P: 1777514518508811 usec, L: 0
0430 02:02:04.540213 (+ 144us) write_op.cc:270] Start()
0430 02:02:04.540225 (+ 12us) write_op.cc:276] Timestamp: P: 1777514518509563 usec, L: 0
0430 02:02:04.540343 (+ 118us) write_op.cc:270] Start()
0430 02:02:04.540357 (+ 14us) write_op.cc:276] Timestamp: P: 1777514518526646 usec, L: 0
0430 02:02:04.540496 (+ 139us) write_op.cc:270] Start()
0430 02:02:04.540513 (+ 17us) write_op.cc:276] Timestamp: P: 1777514518527684 usec, L: 0
0430 02:02:04.540640 (+ 127us) write_op.cc:270] Start()
0430 02:02:04.540658 (+ 18us) write_op.cc:276] Timestamp: P: 1777514518550339 usec, L: 0
0430 02:02:04.540800 (+ 142us) write_op.cc:270] Start()
0430 02:02:04.540816 (+ 16us) write_op.cc:276] Timestamp: P: 1777514518563282 usec, L: 0
0430 02:02:04.540952 (+ 136us) write_op.cc:270] Start()
0430 02:02:04.540966 (+ 14us) write_op.cc:276] Timestamp: P: 1777514518564050 usec, L: 0
0430 02:02:04.541084 (+ 118us) write_op.cc:270] Start()
0430 02:02:04.541097 (+ 13us) write_op.cc:276] Timestamp: P: 17775145
I20260430 02:02:05.709815 26219 raft_consensus.cc:493] T fbfe4b6b54594e36a76de6e54d2adb8c P e481cdb11e4040f2b78a224a2bed5eaa [term 3 FOLLOWER]: Starting pre-election (detected failure of leader a4ec6dffeb11435b8655672771cd29c4)
I20260430 02:02:05.709960 26219 raft_consensus.cc:515] T fbfe4b6b54594e36a76de6e54d2adb8c P e481cdb11e4040f2b78a224a2bed5eaa [term 3 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } }
I20260430 02:02:05.710434 26219 leader_election.cc:290] T fbfe4b6b54594e36a76de6e54d2adb8c P e481cdb11e4040f2b78a224a2bed5eaa [CANDIDATE]: Term 4 pre-election: Requested pre-vote from peers f5202ea2c8244e849a11073ee5d918c5 (127.24.153.66:43745), a4ec6dffeb11435b8655672771cd29c4 (127.24.153.67:34223)
I20260430 02:02:05.711135 25631 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "fbfe4b6b54594e36a76de6e54d2adb8c" candidate_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" candidate_term: 4 candidate_status { last_received { term: 3 index: 741 } } ignore_live_leader: false dest_uuid: "a4ec6dffeb11435b8655672771cd29c4" is_pre_election: true
I20260430 02:02:05.711810 25498 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "fbfe4b6b54594e36a76de6e54d2adb8c" candidate_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" candidate_term: 4 candidate_status { last_received { term: 3 index: 741 } } ignore_live_leader: false dest_uuid: "f5202ea2c8244e849a11073ee5d918c5" is_pre_election: true
I20260430 02:02:05.712459 26100 leader_election.cc:304] T fbfe4b6b54594e36a76de6e54d2adb8c P e481cdb11e4040f2b78a224a2bed5eaa [CANDIDATE]: Term 4 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: e481cdb11e4040f2b78a224a2bed5eaa; no voters: a4ec6dffeb11435b8655672771cd29c4, f5202ea2c8244e849a11073ee5d918c5
I20260430 02:02:05.712750 26219 raft_consensus.cc:2749] T fbfe4b6b54594e36a76de6e54d2adb8c P e481cdb11e4040f2b78a224a2bed5eaa [term 3 FOLLOWER]: Leader pre-election lost for term 4. Reason: could not achieve majority
I20260430 02:02:06.191766 26003 ts_manager.cc:284] Unset tserver state for e481cdb11e4040f2b78a224a2bed5eaa from MAINTENANCE_MODE
I20260430 02:02:06.660482 25926 heartbeater.cc:507] Master 127.24.153.126:40723 requested a full tablet report, sending...
I20260430 02:02:06.707813 25547 heartbeater.cc:507] Master 127.24.153.126:40723 requested a full tablet report, sending...
I20260430 02:02:06.750327 25678 heartbeater.cc:507] Master 127.24.153.126:40723 requested a full tablet report, sending...
I20260430 02:02:06.763036 26211 heartbeater.cc:507] Master 127.24.153.126:40723 requested a full tablet report, sending...
I20260430 02:02:09.311440 26003 ts_manager.cc:295] Set tserver state for e481cdb11e4040f2b78a224a2bed5eaa to MAINTENANCE_MODE
I20260430 02:02:09.312793 25189 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu with pid 26074
W20260430 02:02:09.361888 25437 connection.cc:570] client connection to 127.24.153.65:44971 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20260430 02:02:09.362030 25437 proxy.cc:239] Call had error, refreshing address and retrying: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107) [suppressed 37 similar messages]
W20260430 02:02:09.362695 25568 proxy.cc:239] Call had error, refreshing address and retrying: Network error: recv got EOF from 127.24.153.65:44971 (error 108) [suppressed 74 similar messages]
W20260430 02:02:09.376147 25568 consensus_peers.cc:597] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20260430 02:02:09.376248 25568 consensus_peers.cc:597] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20260430 02:02:09.376288 25568 consensus_peers.cc:597] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20260430 02:02:09.376322 25568 consensus_peers.cc:597] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20260430 02:02:09.402733 25437 consensus_peers.cc:597] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20260430 02:02:09.402864 25437 consensus_peers.cc:597] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20260430 02:02:09.795591 25568 consensus_peers.cc:597] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20260430 02:02:09.831748 25568 consensus_peers.cc:597] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20260430 02:02:09.862711 25568 consensus_peers.cc:597] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20260430 02:02:09.885217 25437 consensus_peers.cc:597] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20260430 02:02:09.890882 25437 consensus_peers.cc:597] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20260430 02:02:09.896443 25568 consensus_peers.cc:597] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20260430 02:02:10.343166 25568 consensus_peers.cc:597] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20260430 02:02:10.343338 25568 consensus_peers.cc:597] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20260430 02:02:10.363415 25568 consensus_peers.cc:597] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20260430 02:02:10.376201 25568 consensus_peers.cc:597] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20260430 02:02:10.388899 25437 consensus_peers.cc:597] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20260430 02:02:10.398682 25437 consensus_peers.cc:597] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20260430 02:02:10.833204 25568 consensus_peers.cc:597] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20260430 02:02:10.841636 25568 consensus_peers.cc:597] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20260430 02:02:10.864028 25437 consensus_peers.cc:597] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20260430 02:02:10.885439 25568 consensus_peers.cc:597] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20260430 02:02:10.905827 25437 consensus_peers.cc:597] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20260430 02:02:10.916153 25568 consensus_peers.cc:597] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
I20260430 02:02:11.330726 25956 consensus_queue.cc:579] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 [LEADER]: Leader has been unable to successfully communicate with peer e481cdb11e4040f2b78a224a2bed5eaa for more than 2 seconds (2.027s)
I20260430 02:02:11.364935 26265 consensus_queue.cc:579] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 [LEADER]: Leader has been unable to successfully communicate with peer e481cdb11e4040f2b78a224a2bed5eaa for more than 2 seconds (2.078s)
W20260430 02:02:11.368811 25437 consensus_peers.cc:597] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20260430 02:02:11.369014 26295 consensus_queue.cc:579] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 [LEADER]: Leader has been unable to successfully communicate with peer e481cdb11e4040f2b78a224a2bed5eaa for more than 2 seconds (2.064s)
W20260430 02:02:11.376389 25568 consensus_peers.cc:597] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20260430 02:02:11.388938 25956 consensus_queue.cc:579] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 [LEADER]: Leader has been unable to successfully communicate with peer e481cdb11e4040f2b78a224a2bed5eaa for more than 2 seconds (2.086s)
I20260430 02:02:11.391245 25770 consensus_queue.cc:579] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 [LEADER]: Leader has been unable to successfully communicate with peer e481cdb11e4040f2b78a224a2bed5eaa for more than 2 seconds (2.089s)
W20260430 02:02:11.397121 25568 consensus_peers.cc:597] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
W20260430 02:02:11.402477 25568 consensus_peers.cc:597] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20260430 02:02:11.414268 26255 consensus_queue.cc:579] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 [LEADER]: Leader has been unable to successfully communicate with peer e481cdb11e4040f2b78a224a2bed5eaa for more than 2 seconds (2.109s)
W20260430 02:02:11.417183 25437 consensus_peers.cc:597] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
W20260430 02:02:11.449525 25568 consensus_peers.cc:597] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
W20260430 02:02:11.857081 25437 consensus_peers.cc:597] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
I20260430 02:02:11.874596 26003 ts_manager.cc:284] Unset tserver state for e481cdb11e4040f2b78a224a2bed5eaa from MAINTENANCE_MODE
W20260430 02:02:11.884346 25568 consensus_peers.cc:597] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20260430 02:02:11.892742 25437 consensus_peers.cc:597] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20260430 02:02:11.903009 25568 consensus_peers.cc:597] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20260430 02:02:11.907903 25568 consensus_peers.cc:597] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20260430 02:02:11.998570 25568 consensus_peers.cc:597] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20260430 02:02:12.346530 25568 consensus_peers.cc:597] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
W20260430 02:02:12.374985 25437 consensus_peers.cc:597] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
W20260430 02:02:12.389969 25437 consensus_peers.cc:597] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
I20260430 02:02:12.395157 25678 heartbeater.cc:507] Master 127.24.153.126:40723 requested a full tablet report, sending...
W20260430 02:02:12.407406 25568 consensus_peers.cc:597] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
I20260430 02:02:12.415870 25629 consensus_queue.cc:237] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 1277, Committed index: 1277, Last appended: 1.1277, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 1278 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } } peers { permanent_uuid: "7f82aeb1ab544f90b83142774323b0a3" member_type: NON_VOTER last_known_addr { host: "127.24.153.68" port: 45771 } attrs { promote: true } }
I20260430 02:02:12.416195 25630 consensus_queue.cc:237] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 1277, Committed index: 1277, Last appended: 2.1278, Last appended by leader: 70, Current term: 2, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 1279 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "7f82aeb1ab544f90b83142774323b0a3" member_type: NON_VOTER last_known_addr { host: "127.24.153.68" port: 45771 } attrs { promote: true } }
I20260430 02:02:12.415870 25632 consensus_queue.cc:237] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 1277, Committed index: 1277, Last appended: 1.1277, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 1278 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } } peers { permanent_uuid: "7f82aeb1ab544f90b83142774323b0a3" member_type: NON_VOTER last_known_addr { host: "127.24.153.68" port: 45771 } attrs { promote: true } }
I20260430 02:02:12.415876 25631 consensus_queue.cc:237] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 1277, Committed index: 1277, Last appended: 3.1277, Last appended by leader: 68, Current term: 3, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 1278 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } } peers { permanent_uuid: "7f82aeb1ab544f90b83142774323b0a3" member_type: NON_VOTER last_known_addr { host: "127.24.153.68" port: 45771 } attrs { promote: true } }
I20260430 02:02:12.420624 25499 raft_consensus.cc:1275] T 29e24069ebd04300a2c4cf4d4bdc5e66 P f5202ea2c8244e849a11073ee5d918c5 [term 2 FOLLOWER]: Refusing update from remote peer a4ec6dffeb11435b8655672771cd29c4: Log matching property violated. Preceding OpId in replica: term: 2 index: 1278. Preceding OpId from leader: term: 2 index: 1279. (index mismatch)
I20260430 02:02:12.421017 25500 raft_consensus.cc:1275] T fbfe4b6b54594e36a76de6e54d2adb8c P f5202ea2c8244e849a11073ee5d918c5 [term 3 FOLLOWER]: Refusing update from remote peer a4ec6dffeb11435b8655672771cd29c4: Log matching property violated. Preceding OpId in replica: term: 3 index: 1277. Preceding OpId from leader: term: 3 index: 1278. (index mismatch)
I20260430 02:02:12.420675 25497 raft_consensus.cc:1275] T 869a29991dbe4537bf1082d1c9ee2ecd P f5202ea2c8244e849a11073ee5d918c5 [term 1 FOLLOWER]: Refusing update from remote peer a4ec6dffeb11435b8655672771cd29c4: Log matching property violated. Preceding OpId in replica: term: 1 index: 1277. Preceding OpId from leader: term: 1 index: 1278. (index mismatch)
I20260430 02:02:12.421643 25501 raft_consensus.cc:1275] T 2c080a5b649f453c903e1dce5ab6a113 P f5202ea2c8244e849a11073ee5d918c5 [term 1 FOLLOWER]: Refusing update from remote peer a4ec6dffeb11435b8655672771cd29c4: Log matching property violated. Preceding OpId in replica: term: 1 index: 1277. Preceding OpId from leader: term: 1 index: 1278. (index mismatch)
I20260430 02:02:12.422657 26292 consensus_queue.cc:1048] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 [LEADER]: Connected to new peer: Peer: permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1278, Last known committed idx: 1277, Time since last communication: 0.000s
I20260430 02:02:12.423034 26286 consensus_queue.cc:1048] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 [LEADER]: Connected to new peer: Peer: permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1278, Last known committed idx: 1277, Time since last communication: 0.000s
I20260430 02:02:12.423571 26286 consensus_queue.cc:1048] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 [LEADER]: Connected to new peer: Peer: permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1279, Last known committed idx: 1277, Time since last communication: 0.000s
I20260430 02:02:12.423870 26286 consensus_queue.cc:1048] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 [LEADER]: Connected to new peer: Peer: permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1278, Last known committed idx: 1277, Time since last communication: 0.000s
I20260430 02:02:12.429446 25547 heartbeater.cc:507] Master 127.24.153.126:40723 requested a full tablet report, sending...
W20260430 02:02:12.431782 25568 consensus_peers.cc:597] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20260430 02:02:12.431936 25568 consensus_peers.cc:597] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20260430 02:02:12.432015 25568 consensus_peers.cc:597] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20260430 02:02:12.432067 25568 consensus_peers.cc:597] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20260430 02:02:12.434659 25565 consensus_peers.cc:597] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 -> Peer 7f82aeb1ab544f90b83142774323b0a3 (127.24.153.68:45771): Couldn't send request to peer 7f82aeb1ab544f90b83142774323b0a3. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: 2c080a5b649f453c903e1dce5ab6a113. This is attempt 1: this message will repeat every 5th retry.
W20260430 02:02:12.434806 25565 consensus_peers.cc:597] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 -> Peer 7f82aeb1ab544f90b83142774323b0a3 (127.24.153.68:45771): Couldn't send request to peer 7f82aeb1ab544f90b83142774323b0a3. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: 29e24069ebd04300a2c4cf4d4bdc5e66. This is attempt 1: this message will repeat every 5th retry.
W20260430 02:02:12.434871 25565 consensus_peers.cc:597] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 -> Peer 7f82aeb1ab544f90b83142774323b0a3 (127.24.153.68:45771): Couldn't send request to peer 7f82aeb1ab544f90b83142774323b0a3. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: 869a29991dbe4537bf1082d1c9ee2ecd. This is attempt 1: this message will repeat every 5th retry.
W20260430 02:02:12.434927 25565 consensus_peers.cc:597] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 -> Peer 7f82aeb1ab544f90b83142774323b0a3 (127.24.153.68:45771): Couldn't send request to peer 7f82aeb1ab544f90b83142774323b0a3. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: fbfe4b6b54594e36a76de6e54d2adb8c. This is attempt 1: this message will repeat every 5th retry.
I20260430 02:02:12.440478 26286 raft_consensus.cc:2955] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 [term 2 LEADER]: Committing config change with OpId 2.1279: config changed from index -1 to 1279, NON_VOTER 7f82aeb1ab544f90b83142774323b0a3 (127.24.153.68) added. New config: { opid_index: 1279 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "7f82aeb1ab544f90b83142774323b0a3" member_type: NON_VOTER last_known_addr { host: "127.24.153.68" port: 45771 } attrs { promote: true } } }
I20260430 02:02:12.441009 26293 raft_consensus.cc:2955] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 [term 3 LEADER]: Committing config change with OpId 3.1278: config changed from index -1 to 1278, NON_VOTER 7f82aeb1ab544f90b83142774323b0a3 (127.24.153.68) added. New config: { opid_index: 1278 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } } peers { permanent_uuid: "7f82aeb1ab544f90b83142774323b0a3" member_type: NON_VOTER last_known_addr { host: "127.24.153.68" port: 45771 } attrs { promote: true } } }
I20260430 02:02:12.441690 25499 raft_consensus.cc:2955] T 29e24069ebd04300a2c4cf4d4bdc5e66 P f5202ea2c8244e849a11073ee5d918c5 [term 2 FOLLOWER]: Committing config change with OpId 2.1279: config changed from index -1 to 1279, NON_VOTER 7f82aeb1ab544f90b83142774323b0a3 (127.24.153.68) added. New config: { opid_index: 1279 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "7f82aeb1ab544f90b83142774323b0a3" member_type: NON_VOTER last_known_addr { host: "127.24.153.68" port: 45771 } attrs { promote: true } } }
I20260430 02:02:12.443538 26294 raft_consensus.cc:2955] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 [term 1 LEADER]: Committing config change with OpId 1.1278: config changed from index -1 to 1278, NON_VOTER 7f82aeb1ab544f90b83142774323b0a3 (127.24.153.68) added. New config: { opid_index: 1278 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } } peers { permanent_uuid: "7f82aeb1ab544f90b83142774323b0a3" member_type: NON_VOTER last_known_addr { host: "127.24.153.68" port: 45771 } attrs { promote: true } } }
I20260430 02:02:12.446475 25497 consensus_queue.cc:237] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 1280, Committed index: 1280, Last appended: 2.1281, Last appended by leader: 68, Current term: 2, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 1282 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } } peers { permanent_uuid: "7f82aeb1ab544f90b83142774323b0a3" member_type: NON_VOTER last_known_addr { host: "127.24.153.68" port: 45771 } attrs { promote: true } }
I20260430 02:02:12.446534 26003 catalog_manager.cc:5671] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 reported cstate change: config changed from index -1 to 1279, NON_VOTER 7f82aeb1ab544f90b83142774323b0a3 (127.24.153.68) added. New cstate: current_term: 2 leader_uuid: "a4ec6dffeb11435b8655672771cd29c4" committed_config { opid_index: 1279 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "7f82aeb1ab544f90b83142774323b0a3" member_type: NON_VOTER last_known_addr { host: "127.24.153.68" port: 45771 } attrs { promote: true } health_report { overall_health: UNKNOWN } } }
W20260430 02:02:12.449123 25437 consensus_peers.cc:597] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
I20260430 02:02:12.449330 25966 raft_consensus.cc:2955] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 [term 1 LEADER]: Committing config change with OpId 1.1278: config changed from index -1 to 1278, NON_VOTER 7f82aeb1ab544f90b83142774323b0a3 (127.24.153.68) added. New config: { opid_index: 1278 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } } peers { permanent_uuid: "7f82aeb1ab544f90b83142774323b0a3" member_type: NON_VOTER last_known_addr { host: "127.24.153.68" port: 45771 } attrs { promote: true } } }
I20260430 02:02:12.450356 25629 raft_consensus.cc:1275] T f1dbbbee06674a93a3dd31c45c90d59d P a4ec6dffeb11435b8655672771cd29c4 [term 2 FOLLOWER]: Refusing update from remote peer f5202ea2c8244e849a11073ee5d918c5: Log matching property violated. Preceding OpId in replica: term: 2 index: 1281. Preceding OpId from leader: term: 2 index: 1282. (index mismatch)
I20260430 02:02:12.450872 26289 consensus_queue.cc:1048] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 [LEADER]: Connected to new peer: Peer: permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1282, Last known committed idx: 1280, Time since last communication: 0.000s
I20260430 02:02:12.453130 26291 raft_consensus.cc:2955] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 [term 2 LEADER]: Committing config change with OpId 2.1282: config changed from index -1 to 1282, NON_VOTER 7f82aeb1ab544f90b83142774323b0a3 (127.24.153.68) added. New config: { opid_index: 1282 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } } peers { permanent_uuid: "7f82aeb1ab544f90b83142774323b0a3" member_type: NON_VOTER last_known_addr { host: "127.24.153.68" port: 45771 } attrs { promote: true } } }
I20260430 02:02:12.454473 25501 consensus_queue.cc:237] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 1280, Committed index: 1280, Last appended: 1.1280, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 1281 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } } peers { permanent_uuid: "7f82aeb1ab544f90b83142774323b0a3" member_type: NON_VOTER last_known_addr { host: "127.24.153.68" port: 45771 } attrs { promote: true } }
I20260430 02:02:12.455263 25992 catalog_manager.cc:5184] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet f1dbbbee06674a93a3dd31c45c90d59d with cas_config_opid_index -1: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
I20260430 02:02:12.457306 25630 raft_consensus.cc:1275] T 63c4457448ec4b1b8a0741a7560cddfe P a4ec6dffeb11435b8655672771cd29c4 [term 1 FOLLOWER]: Refusing update from remote peer f5202ea2c8244e849a11073ee5d918c5: Log matching property violated. Preceding OpId in replica: term: 1 index: 1280. Preceding OpId from leader: term: 1 index: 1281. (index mismatch)
I20260430 02:02:12.457701 25629 raft_consensus.cc:2955] T f1dbbbee06674a93a3dd31c45c90d59d P a4ec6dffeb11435b8655672771cd29c4 [term 2 FOLLOWER]: Committing config change with OpId 2.1282: config changed from index -1 to 1282, NON_VOTER 7f82aeb1ab544f90b83142774323b0a3 (127.24.153.68) added. New config: { opid_index: 1282 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } } peers { permanent_uuid: "7f82aeb1ab544f90b83142774323b0a3" member_type: NON_VOTER last_known_addr { host: "127.24.153.68" port: 45771 } attrs { promote: true } } }
I20260430 02:02:12.457854 26290 consensus_queue.cc:1048] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 [LEADER]: Connected to new peer: Peer: permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1281, Last known committed idx: 1280, Time since last communication: 0.000s
I20260430 02:02:12.457726 26003 catalog_manager.cc:5671] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 reported cstate change: config changed from index -1 to 1278, NON_VOTER 7f82aeb1ab544f90b83142774323b0a3 (127.24.153.68) added. New cstate: current_term: 3 leader_uuid: "a4ec6dffeb11435b8655672771cd29c4" committed_config { opid_index: 1278 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "7f82aeb1ab544f90b83142774323b0a3" member_type: NON_VOTER last_known_addr { host: "127.24.153.68" port: 45771 } attrs { promote: true } health_report { overall_health: UNKNOWN } } }
I20260430 02:02:12.458490 26003 catalog_manager.cc:5671] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 reported cstate change: config changed from index -1 to 1278, NON_VOTER 7f82aeb1ab544f90b83142774323b0a3 (127.24.153.68) added. New cstate: current_term: 1 leader_uuid: "a4ec6dffeb11435b8655672771cd29c4" committed_config { opid_index: 1278 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "7f82aeb1ab544f90b83142774323b0a3" member_type: NON_VOTER last_known_addr { host: "127.24.153.68" port: 45771 } attrs { promote: true } health_report { overall_health: UNKNOWN } } }
I20260430 02:02:12.458801 26003 catalog_manager.cc:5671] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 reported cstate change: config changed from index -1 to 1278, NON_VOTER 7f82aeb1ab544f90b83142774323b0a3 (127.24.153.68) added. New cstate: current_term: 1 leader_uuid: "a4ec6dffeb11435b8655672771cd29c4" committed_config { opid_index: 1278 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "7f82aeb1ab544f90b83142774323b0a3" member_type: NON_VOTER last_known_addr { host: "127.24.153.68" port: 45771 } attrs { promote: true } health_report { overall_health: UNKNOWN } } }
W20260430 02:02:12.459992 25437 consensus_peers.cc:597] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
I20260430 02:02:12.459880 26255 raft_consensus.cc:2955] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 [term 1 LEADER]: Committing config change with OpId 1.1281: config changed from index -1 to 1281, NON_VOTER 7f82aeb1ab544f90b83142774323b0a3 (127.24.153.68) added. New config: { opid_index: 1281 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } } peers { permanent_uuid: "7f82aeb1ab544f90b83142774323b0a3" member_type: NON_VOTER last_known_addr { host: "127.24.153.68" port: 45771 } attrs { promote: true } } }
I20260430 02:02:12.460762 25629 raft_consensus.cc:2955] T 63c4457448ec4b1b8a0741a7560cddfe P a4ec6dffeb11435b8655672771cd29c4 [term 1 FOLLOWER]: Committing config change with OpId 1.1281: config changed from index -1 to 1281, NON_VOTER 7f82aeb1ab544f90b83142774323b0a3 (127.24.153.68) added. New config: { opid_index: 1281 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } } peers { permanent_uuid: "7f82aeb1ab544f90b83142774323b0a3" member_type: NON_VOTER last_known_addr { host: "127.24.153.68" port: 45771 } attrs { promote: true } } }
I20260430 02:02:12.461371 25992 catalog_manager.cc:5184] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet 63c4457448ec4b1b8a0741a7560cddfe with cas_config_opid_index -1: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
I20260430 02:02:12.463040 25991 catalog_manager.cc:5184] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet 29e24069ebd04300a2c4cf4d4bdc5e66 with cas_config_opid_index -1: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
I20260430 02:02:12.463249 25991 catalog_manager.cc:5184] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet fbfe4b6b54594e36a76de6e54d2adb8c with cas_config_opid_index -1: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
I20260430 02:02:12.463392 25991 catalog_manager.cc:5184] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet 2c080a5b649f453c903e1dce5ab6a113 with cas_config_opid_index -1: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
I20260430 02:02:12.463490 25991 catalog_manager.cc:5184] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet 869a29991dbe4537bf1082d1c9ee2ecd with cas_config_opid_index -1: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
I20260430 02:02:12.464777 26003 catalog_manager.cc:5671] T f1dbbbee06674a93a3dd31c45c90d59d P a4ec6dffeb11435b8655672771cd29c4 reported cstate change: config changed from index -1 to 1282, NON_VOTER 7f82aeb1ab544f90b83142774323b0a3 (127.24.153.68) added. New cstate: current_term: 2 leader_uuid: "f5202ea2c8244e849a11073ee5d918c5" committed_config { opid_index: 1282 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } } peers { permanent_uuid: "7f82aeb1ab544f90b83142774323b0a3" member_type: NON_VOTER last_known_addr { host: "127.24.153.68" port: 45771 } attrs { promote: true } } }
I20260430 02:02:12.465030 26003 catalog_manager.cc:5671] T 63c4457448ec4b1b8a0741a7560cddfe P a4ec6dffeb11435b8655672771cd29c4 reported cstate change: config changed from index -1 to 1281, NON_VOTER 7f82aeb1ab544f90b83142774323b0a3 (127.24.153.68) added. New cstate: current_term: 1 leader_uuid: "f5202ea2c8244e849a11073ee5d918c5" committed_config { opid_index: 1281 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } } peers { permanent_uuid: "7f82aeb1ab544f90b83142774323b0a3" member_type: NON_VOTER last_known_addr { host: "127.24.153.68" port: 45771 } attrs { promote: true } } }
I20260430 02:02:12.467545 25499 raft_consensus.cc:2955] T fbfe4b6b54594e36a76de6e54d2adb8c P f5202ea2c8244e849a11073ee5d918c5 [term 3 FOLLOWER]: Committing config change with OpId 3.1278: config changed from index -1 to 1278, NON_VOTER 7f82aeb1ab544f90b83142774323b0a3 (127.24.153.68) added. New config: { opid_index: 1278 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } } peers { permanent_uuid: "7f82aeb1ab544f90b83142774323b0a3" member_type: NON_VOTER last_known_addr { host: "127.24.153.68" port: 45771 } attrs { promote: true } } }
I20260430 02:02:12.468930 25501 raft_consensus.cc:2955] T 869a29991dbe4537bf1082d1c9ee2ecd P f5202ea2c8244e849a11073ee5d918c5 [term 1 FOLLOWER]: Committing config change with OpId 1.1278: config changed from index -1 to 1278, NON_VOTER 7f82aeb1ab544f90b83142774323b0a3 (127.24.153.68) added. New config: { opid_index: 1278 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } } peers { permanent_uuid: "7f82aeb1ab544f90b83142774323b0a3" member_type: NON_VOTER last_known_addr { host: "127.24.153.68" port: 45771 } attrs { promote: true } } }
I20260430 02:02:12.472299 25497 raft_consensus.cc:2955] T 2c080a5b649f453c903e1dce5ab6a113 P f5202ea2c8244e849a11073ee5d918c5 [term 1 FOLLOWER]: Committing config change with OpId 1.1278: config changed from index -1 to 1278, NON_VOTER 7f82aeb1ab544f90b83142774323b0a3 (127.24.153.68) added. New config: { opid_index: 1278 OBSOLETE_local: false peers { permanent_uuid: "e481cdb11e4040f2b78a224a2bed5eaa" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 44971 } } peers { permanent_uuid: "f5202ea2c8244e849a11073ee5d918c5" member_type: VOTER last_known_addr { host: "127.24.153.66" port: 43745 } } peers { permanent_uuid: "a4ec6dffeb11435b8655672771cd29c4" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 34223 } } peers { permanent_uuid: "7f82aeb1ab544f90b83142774323b0a3" member_type: NON_VOTER last_known_addr { host: "127.24.153.68" port: 45771 } attrs { promote: true } } }
W20260430 02:02:12.480012 25434 consensus_peers.cc:597] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 -> Peer 7f82aeb1ab544f90b83142774323b0a3 (127.24.153.68:45771): Couldn't send request to peer 7f82aeb1ab544f90b83142774323b0a3. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: f1dbbbee06674a93a3dd31c45c90d59d. This is attempt 1: this message will repeat every 5th retry.
W20260430 02:02:12.480414 25434 consensus_peers.cc:597] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 -> Peer 7f82aeb1ab544f90b83142774323b0a3 (127.24.153.68:45771): Couldn't send request to peer 7f82aeb1ab544f90b83142774323b0a3. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: 63c4457448ec4b1b8a0741a7560cddfe. This is attempt 1: this message will repeat every 5th retry.
I20260430 02:02:12.510530 26310 ts_tablet_manager.cc:933] T 2c080a5b649f453c903e1dce5ab6a113 P 7f82aeb1ab544f90b83142774323b0a3: Initiating tablet copy from peer a4ec6dffeb11435b8655672771cd29c4 (127.24.153.67:34223)
I20260430 02:02:12.511332 26310 tablet_copy_client.cc:323] T 2c080a5b649f453c903e1dce5ab6a113 P 7f82aeb1ab544f90b83142774323b0a3: tablet copy: Beginning tablet copy session from remote peer at address 127.24.153.67:34223
I20260430 02:02:12.524624 26313 ts_tablet_manager.cc:933] T fbfe4b6b54594e36a76de6e54d2adb8c P 7f82aeb1ab544f90b83142774323b0a3: Initiating tablet copy from peer a4ec6dffeb11435b8655672771cd29c4 (127.24.153.67:34223)
I20260430 02:02:12.525167 26313 tablet_copy_client.cc:323] T fbfe4b6b54594e36a76de6e54d2adb8c P 7f82aeb1ab544f90b83142774323b0a3: tablet copy: Beginning tablet copy session from remote peer at address 127.24.153.67:34223
I20260430 02:02:12.528828 26315 ts_tablet_manager.cc:933] T 29e24069ebd04300a2c4cf4d4bdc5e66 P 7f82aeb1ab544f90b83142774323b0a3: Initiating tablet copy from peer a4ec6dffeb11435b8655672771cd29c4 (127.24.153.67:34223)
I20260430 02:02:12.533667 25651 tablet_copy_service.cc:140] P a4ec6dffeb11435b8655672771cd29c4: Received BeginTabletCopySession request for tablet fbfe4b6b54594e36a76de6e54d2adb8c from peer 7f82aeb1ab544f90b83142774323b0a3 ({username='slave'} at 127.24.153.68:38947)
I20260430 02:02:12.533834 25651 tablet_copy_service.cc:161] P a4ec6dffeb11435b8655672771cd29c4: Beginning new tablet copy session on tablet fbfe4b6b54594e36a76de6e54d2adb8c from peer 7f82aeb1ab544f90b83142774323b0a3 at {username='slave'} at 127.24.153.68:38947: session id = 7f82aeb1ab544f90b83142774323b0a3-fbfe4b6b54594e36a76de6e54d2adb8c
I20260430 02:02:12.535465 25651 tablet_copy_source_session.cc:215] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4: Tablet Copy: opened 0 blocks and 1 log segments
I20260430 02:02:12.535656 26315 tablet_copy_client.cc:323] T 29e24069ebd04300a2c4cf4d4bdc5e66 P 7f82aeb1ab544f90b83142774323b0a3: tablet copy: Beginning tablet copy session from remote peer at address 127.24.153.67:34223
I20260430 02:02:12.533670 25652 tablet_copy_service.cc:140] P a4ec6dffeb11435b8655672771cd29c4: Received BeginTabletCopySession request for tablet 2c080a5b649f453c903e1dce5ab6a113 from peer 7f82aeb1ab544f90b83142774323b0a3 ({username='slave'} at 127.24.153.68:38947)
I20260430 02:02:12.537179 25652 tablet_copy_service.cc:161] P a4ec6dffeb11435b8655672771cd29c4: Beginning new tablet copy session on tablet 2c080a5b649f453c903e1dce5ab6a113 from peer 7f82aeb1ab544f90b83142774323b0a3 at {username='slave'} at 127.24.153.68:38947: session id = 7f82aeb1ab544f90b83142774323b0a3-2c080a5b649f453c903e1dce5ab6a113
I20260430 02:02:12.538133 25652 tablet_copy_source_session.cc:215] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4: Tablet Copy: opened 0 blocks and 1 log segments
I20260430 02:02:12.541523 25652 tablet_copy_service.cc:140] P a4ec6dffeb11435b8655672771cd29c4: Received BeginTabletCopySession request for tablet 29e24069ebd04300a2c4cf4d4bdc5e66 from peer 7f82aeb1ab544f90b83142774323b0a3 ({username='slave'} at 127.24.153.68:38947)
I20260430 02:02:12.541662 25652 tablet_copy_service.cc:161] P a4ec6dffeb11435b8655672771cd29c4: Beginning new tablet copy session on tablet 29e24069ebd04300a2c4cf4d4bdc5e66 from peer 7f82aeb1ab544f90b83142774323b0a3 at {username='slave'} at 127.24.153.68:38947: session id = 7f82aeb1ab544f90b83142774323b0a3-29e24069ebd04300a2c4cf4d4bdc5e66
I20260430 02:02:12.542666 25652 tablet_copy_source_session.cc:215] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4: Tablet Copy: opened 0 blocks and 1 log segments
I20260430 02:02:12.543680 26313 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet fbfe4b6b54594e36a76de6e54d2adb8c. 1 dirs total, 0 dirs full, 0 dirs failed
I20260430 02:02:12.545763 26310 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 2c080a5b649f453c903e1dce5ab6a113. 1 dirs total, 0 dirs full, 0 dirs failed
I20260430 02:02:12.550165 26317 ts_tablet_manager.cc:933] T 869a29991dbe4537bf1082d1c9ee2ecd P 7f82aeb1ab544f90b83142774323b0a3: Initiating tablet copy from peer a4ec6dffeb11435b8655672771cd29c4 (127.24.153.67:34223)
I20260430 02:02:12.550647 26317 tablet_copy_client.cc:323] T 869a29991dbe4537bf1082d1c9ee2ecd P 7f82aeb1ab544f90b83142774323b0a3: tablet copy: Beginning tablet copy session from remote peer at address 127.24.153.67:34223
I20260430 02:02:12.551244 25652 tablet_copy_service.cc:140] P a4ec6dffeb11435b8655672771cd29c4: Received BeginTabletCopySession request for tablet 869a29991dbe4537bf1082d1c9ee2ecd from peer 7f82aeb1ab544f90b83142774323b0a3 ({username='slave'} at 127.24.153.68:38947)
I20260430 02:02:12.551365 25652 tablet_copy_service.cc:161] P a4ec6dffeb11435b8655672771cd29c4: Beginning new tablet copy session on tablet 869a29991dbe4537bf1082d1c9ee2ecd from peer 7f82aeb1ab544f90b83142774323b0a3 at {username='slave'} at 127.24.153.68:38947: session id = 7f82aeb1ab544f90b83142774323b0a3-869a29991dbe4537bf1082d1c9ee2ecd
I20260430 02:02:12.552343 25652 tablet_copy_source_session.cc:215] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4: Tablet Copy: opened 0 blocks and 1 log segments
I20260430 02:02:12.553648 26319 ts_tablet_manager.cc:933] T 63c4457448ec4b1b8a0741a7560cddfe P 7f82aeb1ab544f90b83142774323b0a3: Initiating tablet copy from peer f5202ea2c8244e849a11073ee5d918c5 (127.24.153.66:43745)
I20260430 02:02:12.554056 26319 tablet_copy_client.cc:323] T 63c4457448ec4b1b8a0741a7560cddfe P 7f82aeb1ab544f90b83142774323b0a3: tablet copy: Beginning tablet copy session from remote peer at address 127.24.153.66:43745
I20260430 02:02:12.564565 26310 tablet_copy_client.cc:806] T 2c080a5b649f453c903e1dce5ab6a113 P 7f82aeb1ab544f90b83142774323b0a3: tablet copy: Starting download of 0 data blocks...
I20260430 02:02:12.564947 26310 tablet_copy_client.cc:670] T 2c080a5b649f453c903e1dce5ab6a113 P 7f82aeb1ab544f90b83142774323b0a3: tablet copy: Starting download of 1 WAL segments...
I20260430 02:02:12.565440 25521 tablet_copy_service.cc:140] P f5202ea2c8244e849a11073ee5d918c5: Received BeginTabletCopySession request for tablet 63c4457448ec4b1b8a0741a7560cddfe from peer 7f82aeb1ab544f90b83142774323b0a3 ({username='slave'} at 127.24.153.68:35891)
I20260430 02:02:12.565578 25521 tablet_copy_service.cc:161] P f5202ea2c8244e849a11073ee5d918c5: Beginning new tablet copy session on tablet 63c4457448ec4b1b8a0741a7560cddfe from peer 7f82aeb1ab544f90b83142774323b0a3 at {username='slave'} at 127.24.153.68:35891: session id = 7f82aeb1ab544f90b83142774323b0a3-63c4457448ec4b1b8a0741a7560cddfe
I20260430 02:02:12.566622 26313 tablet_copy_client.cc:806] T fbfe4b6b54594e36a76de6e54d2adb8c P 7f82aeb1ab544f90b83142774323b0a3: tablet copy: Starting download of 0 data blocks...
I20260430 02:02:12.566677 26317 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 869a29991dbe4537bf1082d1c9ee2ecd. 1 dirs total, 0 dirs full, 0 dirs failed
I20260430 02:02:12.566781 26313 tablet_copy_client.cc:670] T fbfe4b6b54594e36a76de6e54d2adb8c P 7f82aeb1ab544f90b83142774323b0a3: tablet copy: Starting download of 1 WAL segments...
I20260430 02:02:12.567173 25521 tablet_copy_source_session.cc:215] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5: Tablet Copy: opened 0 blocks and 1 log segments
I20260430 02:02:12.567379 26315 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 29e24069ebd04300a2c4cf4d4bdc5e66. 1 dirs total, 0 dirs full, 0 dirs failed
I20260430 02:02:12.568782 26319 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 63c4457448ec4b1b8a0741a7560cddfe. 1 dirs total, 0 dirs full, 0 dirs failed
I20260430 02:02:12.569655 26317 tablet_copy_client.cc:806] T 869a29991dbe4537bf1082d1c9ee2ecd P 7f82aeb1ab544f90b83142774323b0a3: tablet copy: Starting download of 0 data blocks...
I20260430 02:02:12.569833 26317 tablet_copy_client.cc:670] T 869a29991dbe4537bf1082d1c9ee2ecd P 7f82aeb1ab544f90b83142774323b0a3: tablet copy: Starting download of 1 WAL segments...
I20260430 02:02:12.572515 26319 tablet_copy_client.cc:806] T 63c4457448ec4b1b8a0741a7560cddfe P 7f82aeb1ab544f90b83142774323b0a3: tablet copy: Starting download of 0 data blocks...
I20260430 02:02:12.572773 26315 tablet_copy_client.cc:806] T 29e24069ebd04300a2c4cf4d4bdc5e66 P 7f82aeb1ab544f90b83142774323b0a3: tablet copy: Starting download of 0 data blocks...
I20260430 02:02:12.573017 26315 tablet_copy_client.cc:670] T 29e24069ebd04300a2c4cf4d4bdc5e66 P 7f82aeb1ab544f90b83142774323b0a3: tablet copy: Starting download of 1 WAL segments...
I20260430 02:02:12.573700 26319 tablet_copy_client.cc:670] T 63c4457448ec4b1b8a0741a7560cddfe P 7f82aeb1ab544f90b83142774323b0a3: tablet copy: Starting download of 1 WAL segments...
I20260430 02:02:12.574038 26321 ts_tablet_manager.cc:933] T f1dbbbee06674a93a3dd31c45c90d59d P 7f82aeb1ab544f90b83142774323b0a3: Initiating tablet copy from peer f5202ea2c8244e849a11073ee5d918c5 (127.24.153.66:43745)
I20260430 02:02:12.575958 26321 tablet_copy_client.cc:323] T f1dbbbee06674a93a3dd31c45c90d59d P 7f82aeb1ab544f90b83142774323b0a3: tablet copy: Beginning tablet copy session from remote peer at address 127.24.153.66:43745
I20260430 02:02:12.576553 25520 tablet_copy_service.cc:140] P f5202ea2c8244e849a11073ee5d918c5: Received BeginTabletCopySession request for tablet f1dbbbee06674a93a3dd31c45c90d59d from peer 7f82aeb1ab544f90b83142774323b0a3 ({username='slave'} at 127.24.153.68:35891)
I20260430 02:02:12.576678 25520 tablet_copy_service.cc:161] P f5202ea2c8244e849a11073ee5d918c5: Beginning new tablet copy session on tablet f1dbbbee06674a93a3dd31c45c90d59d from peer 7f82aeb1ab544f90b83142774323b0a3 at {username='slave'} at 127.24.153.68:35891: session id = 7f82aeb1ab544f90b83142774323b0a3-f1dbbbee06674a93a3dd31c45c90d59d
I20260430 02:02:12.577704 25520 tablet_copy_source_session.cc:215] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5: Tablet Copy: opened 0 blocks and 1 log segments
I20260430 02:02:12.585675 26321 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet f1dbbbee06674a93a3dd31c45c90d59d. 1 dirs total, 0 dirs full, 0 dirs failed
I20260430 02:02:12.590387 26321 tablet_copy_client.cc:806] T f1dbbbee06674a93a3dd31c45c90d59d P 7f82aeb1ab544f90b83142774323b0a3: tablet copy: Starting download of 0 data blocks...
I20260430 02:02:12.594396 26321 tablet_copy_client.cc:670] T f1dbbbee06674a93a3dd31c45c90d59d P 7f82aeb1ab544f90b83142774323b0a3: tablet copy: Starting download of 1 WAL segments...
I20260430 02:02:12.598069 26319 tablet_copy_client.cc:538] T 63c4457448ec4b1b8a0741a7560cddfe P 7f82aeb1ab544f90b83142774323b0a3: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20260430 02:02:12.600878 26319 tablet_bootstrap.cc:492] T 63c4457448ec4b1b8a0741a7560cddfe P 7f82aeb1ab544f90b83142774323b0a3: Bootstrap starting.
I20260430 02:02:12.608820 26310 tablet_copy_client.cc:538] T 2c080a5b649f453c903e1dce5ab6a113 P 7f82aeb1ab544f90b83142774323b0a3: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20260430 02:02:12.611711 26310 tablet_bootstrap.cc:492] T 2c080a5b649f453c903e1dce5ab6a113 P 7f82aeb1ab544f90b83142774323b0a3: Bootstrap starting.
I20260430 02:02:12.612123 26313 tablet_copy_client.cc:538] T fbfe4b6b54594e36a76de6e54d2adb8c P 7f82aeb1ab544f90b83142774323b0a3: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20260430 02:02:12.618198 26313 tablet_bootstrap.cc:492] T fbfe4b6b54594e36a76de6e54d2adb8c P 7f82aeb1ab544f90b83142774323b0a3: Bootstrap starting.
I20260430 02:02:12.634101 26315 tablet_copy_client.cc:538] T 29e24069ebd04300a2c4cf4d4bdc5e66 P 7f82aeb1ab544f90b83142774323b0a3: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20260430 02:02:12.635165 26321 tablet_copy_client.cc:538] T f1dbbbee06674a93a3dd31c45c90d59d P 7f82aeb1ab544f90b83142774323b0a3: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20260430 02:02:12.636852 26321 tablet_bootstrap.cc:492] T f1dbbbee06674a93a3dd31c45c90d59d P 7f82aeb1ab544f90b83142774323b0a3: Bootstrap starting.
I20260430 02:02:12.642823 26315 tablet_bootstrap.cc:492] T 29e24069ebd04300a2c4cf4d4bdc5e66 P 7f82aeb1ab544f90b83142774323b0a3: Bootstrap starting.
I20260430 02:02:12.677371 25926 heartbeater.cc:507] Master 127.24.153.126:40723 requested a full tablet report, sending...
I20260430 02:02:12.683805 26317 tablet_copy_client.cc:538] T 869a29991dbe4537bf1082d1c9ee2ecd P 7f82aeb1ab544f90b83142774323b0a3: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20260430 02:02:12.686004 26317 tablet_bootstrap.cc:492] T 869a29991dbe4537bf1082d1c9ee2ecd P 7f82aeb1ab544f90b83142774323b0a3: Bootstrap starting.
I20260430 02:02:12.835819 26313 log.cc:826] T fbfe4b6b54594e36a76de6e54d2adb8c P 7f82aeb1ab544f90b83142774323b0a3: Log is configured to *not* fsync() on all Append() calls
W20260430 02:02:12.853628 25568 consensus_peers.cc:597] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20260430 02:02:12.892386 25568 consensus_peers.cc:597] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20260430 02:02:12.906246 25568 consensus_peers.cc:597] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20260430 02:02:12.953724 25568 consensus_peers.cc:597] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20260430 02:02:12.961345 25437 consensus_peers.cc:597] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20260430 02:02:12.998963 25437 consensus_peers.cc:597] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20260430 02:02:13.007083 25565 consensus_peers.cc:597] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 -> Peer 7f82aeb1ab544f90b83142774323b0a3 (127.24.153.68:45771): Couldn't send request to peer 7f82aeb1ab544f90b83142774323b0a3. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: BOOTSTRAPPING. This is attempt 6: this message will repeat every 5th retry.
W20260430 02:02:13.020663 25565 consensus_peers.cc:597] T 2c080a5b649f453c903e1dce5ab6a113 P a4ec6dffeb11435b8655672771cd29c4 -> Peer 7f82aeb1ab544f90b83142774323b0a3 (127.24.153.68:45771): Couldn't send request to peer 7f82aeb1ab544f90b83142774323b0a3. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: BOOTSTRAPPING. This is attempt 6: this message will repeat every 5th retry.
W20260430 02:02:13.052229 25434 consensus_peers.cc:597] T f1dbbbee06674a93a3dd31c45c90d59d P f5202ea2c8244e849a11073ee5d918c5 -> Peer 7f82aeb1ab544f90b83142774323b0a3 (127.24.153.68:45771): Couldn't send request to peer 7f82aeb1ab544f90b83142774323b0a3. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: BOOTSTRAPPING. This is attempt 6: this message will repeat every 5th retry.
W20260430 02:02:13.067345 25565 consensus_peers.cc:597] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 -> Peer 7f82aeb1ab544f90b83142774323b0a3 (127.24.153.68:45771): Couldn't send request to peer 7f82aeb1ab544f90b83142774323b0a3. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: BOOTSTRAPPING. This is attempt 6: this message will repeat every 5th retry.
W20260430 02:02:13.094321 25565 consensus_peers.cc:597] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 -> Peer 7f82aeb1ab544f90b83142774323b0a3 (127.24.153.68:45771): Couldn't send request to peer 7f82aeb1ab544f90b83142774323b0a3. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: BOOTSTRAPPING. This is attempt 6: this message will repeat every 5th retry.
W20260430 02:02:13.096932 25434 consensus_peers.cc:597] T 63c4457448ec4b1b8a0741a7560cddfe P f5202ea2c8244e849a11073ee5d918c5 -> Peer 7f82aeb1ab544f90b83142774323b0a3 (127.24.153.68:45771): Couldn't send request to peer 7f82aeb1ab544f90b83142774323b0a3. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: BOOTSTRAPPING. This is attempt 6: this message will repeat every 5th retry.
W20260430 02:02:13.279886 25568 consensus_peers.cc:597] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
I20260430 02:02:13.295140 25189 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu with pid 25419
W20260430 02:02:13.335207 25568 consensus_peers.cc:597] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 -> Peer e481cdb11e4040f2b78a224a2bed5eaa (127.24.153.65:44971): Couldn't send request to peer e481cdb11e4040f2b78a224a2bed5eaa. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.65:44971: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20260430 02:02:13.348147 25568 connection.cc:570] client connection to 127.24.153.66:43745 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20260430 02:02:13.350143 26305 negotiation.cc:336] Failed RPC negotiation. Trace:
0430 02:02:13.348989 (+ 0us) reactor.cc:678] Submitting negotiation task for client connection to 127.24.153.66:43745 (local address 127.24.153.67:55609)
0430 02:02:13.349407 (+ 418us) negotiation.cc:107] Waiting for socket to connect
0430 02:02:13.349461 (+ 54us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.24.153.66:43745: connect: Connection reset by peer (error 104)
Metrics: {"client-negotiator.queue_time_us":314}
W20260430 02:02:13.350577 25568 consensus_peers.cc:597] T 869a29991dbe4537bf1082d1c9ee2ecd P a4ec6dffeb11435b8655672771cd29c4 -> Peer f5202ea2c8244e849a11073ee5d918c5 (127.24.153.66:43745): Couldn't send request to peer f5202ea2c8244e849a11073ee5d918c5. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.66:43745: connect: Connection reset by peer (error 104). This is attempt 1: this message will repeat every 5th retry.
W20260430 02:02:13.350667 25568 consensus_peers.cc:597] T fbfe4b6b54594e36a76de6e54d2adb8c P a4ec6dffeb11435b8655672771cd29c4 -> Peer f5202ea2c8244e849a11073ee5d918c5 (127.24.153.66:43745): Couldn't send request to peer f5202ea2c8244e849a11073ee5d918c5. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.66:43745: connect: Connection reset by peer (error 104). This is attempt 1: this message will repeat every 5th retry.
W20260430 02:02:13.350708 25568 consensus_peers.cc:597] T 29e24069ebd04300a2c4cf4d4bdc5e66 P a4ec6dffeb11435b8655672771cd29c4 -> Peer f5202ea2c8244e849a11073ee5d918c5 (127.24.153.66:43745): Couldn't send request to peer f5202ea2c8244e849a11073ee5d918c5. Status: Network error: Client connection negotiation failed: client connection to 127.24.153.66:43745: connect: Connection reset by peer (error 104). This is attempt 1: this message will repeat every 5th retry.
I20260430 02:02:13.366158 25189 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu with pid 25550
I20260430 02:02:13.382179 25189 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu with pid 25752
I20260430 02:02:13.389415 25189 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu with pid 25974
2026-04-30T02:02:13Z chronyd exiting
[ OK ] MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate (19585 ms)
[----------] 1 test from MaintenanceModeRF3ITest (19585 ms total)
[----------] 1 test from RollingRestartArgs/RollingRestartITest
[ RUN ] RollingRestartArgs/RollingRestartITest.TestWorkloads/4
2026-04-30T02:02:13Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2026-04-30T02:02:13Z Disabled control of system clock
I20260430 02:02:13.436836 25189 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
/tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.24.153.126:41269
--webserver_interface=127.24.153.126
--webserver_port=0
--builtin_ntp_servers=127.24.153.84:44061
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.24.153.126:41269
--location_mapping_cmd=/tmp/dist-test-taskfXPN2o/build/debug/bin/testdata/assign-location.py --state_store=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/location-assignment.state --map /L0:4
--master_client_location_assignment_enabled=false with env {}
W20260430 02:02:13.560340 26341 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260430 02:02:13.560662 26341 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260430 02:02:13.560707 26341 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260430 02:02:13.565258 26341 flags.cc:432] Enabled experimental flag: --ipki_ca_key_size=768
W20260430 02:02:13.565372 26341 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260430 02:02:13.565395 26341 flags.cc:432] Enabled experimental flag: --tsk_num_rsa_bits=512
W20260430 02:02:13.565414 26341 flags.cc:432] Enabled experimental flag: --rpc_reuseport=true
I20260430 02:02:13.570077 26341 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.24.153.84:44061
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/master-0/wal
--location_mapping_cmd=/tmp/dist-test-taskfXPN2o/build/debug/bin/testdata/assign-location.py --state_store=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/location-assignment.state --map /L0:4
--ipki_ca_key_size=768
--master_addresses=127.24.153.126:41269
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.24.153.126:41269
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.24.153.126
--webserver_port=0
--never_fsync=true
--heap_profile_path=/tmp/kudu.26341
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true
Master server version:
kudu 1.19.0-SNAPSHOT
revision 8537ed5bb5e0e6ac1e481ff961e6b9d08cce7671
build type DEBUG
built by None at 30 Apr 2026 01:43:13 UTC on bdcb31816ec0
build id 11668
I20260430 02:02:13.571333 26341 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260430 02:02:13.572582 26341 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260430 02:02:13.581903 26346 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:02:13.581902 26349 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:02:13.581904 26347 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260430 02:02:13.582512 26341 server_base.cc:1061] running on GCE node
I20260430 02:02:13.583292 26341 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260430 02:02:13.584383 26341 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260430 02:02:13.585606 26341 hybrid_clock.cc:648] HybridClock initialized: now 1777514533585565 us; error 82 us; skew 500 ppm
I20260430 02:02:13.588157 26341 webserver.cc:492] Webserver started at http://127.24.153.126:45785/ using document root <none> and password file <none>
I20260430 02:02:13.588889 26341 fs_manager.cc:362] Metadata directory not provided
I20260430 02:02:13.588985 26341 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260430 02:02:13.589224 26341 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260430 02:02:13.591109 26341 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/master-0/data/instance:
uuid: "42db9cb347494d76ab96d630881bc311"
format_stamp: "Formatted at 2026-04-30 02:02:13 on dist-test-slave-f7mg"
I20260430 02:02:13.591684 26341 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/master-0/wal/instance:
uuid: "42db9cb347494d76ab96d630881bc311"
format_stamp: "Formatted at 2026-04-30 02:02:13 on dist-test-slave-f7mg"
I20260430 02:02:13.595713 26341 fs_manager.cc:696] Time spent creating directory manager: real 0.004s user 0.005s sys 0.000s
I20260430 02:02:13.599285 26355 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260430 02:02:13.601001 26341 fs_manager.cc:730] Time spent opening block manager: real 0.004s user 0.004s sys 0.000s
I20260430 02:02:13.601291 26341 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/master-0/data,/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/master-0/wal
uuid: "42db9cb347494d76ab96d630881bc311"
format_stamp: "Formatted at 2026-04-30 02:02:13 on dist-test-slave-f7mg"
I20260430 02:02:13.601642 26341 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260430 02:02:13.636971 26341 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260430 02:02:13.637717 26341 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260430 02:02:13.638012 26341 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260430 02:02:13.646612 26341 rpc_server.cc:307] RPC server started. Bound to: 127.24.153.126:41269
I20260430 02:02:13.646636 26407 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.24.153.126:41269 every 8 connection(s)
I20260430 02:02:13.648284 26341 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/master-0/data/info.pb
I20260430 02:02:13.652143 26408 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20260430 02:02:13.654693 25189 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu as pid 26341
I20260430 02:02:13.654847 25189 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/master-0/wal/instance
I20260430 02:02:13.658293 26408 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 42db9cb347494d76ab96d630881bc311: Bootstrap starting.
I20260430 02:02:13.660964 26408 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P 42db9cb347494d76ab96d630881bc311: Neither blocks nor log segments found. Creating new log.
I20260430 02:02:13.661921 26408 log.cc:826] T 00000000000000000000000000000000 P 42db9cb347494d76ab96d630881bc311: Log is configured to *not* fsync() on all Append() calls
I20260430 02:02:13.664870 26408 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 42db9cb347494d76ab96d630881bc311: No bootstrap required, opened a new log
I20260430 02:02:13.668473 26408 raft_consensus.cc:359] T 00000000000000000000000000000000 P 42db9cb347494d76ab96d630881bc311 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "42db9cb347494d76ab96d630881bc311" member_type: VOTER last_known_addr { host: "127.24.153.126" port: 41269 } }
I20260430 02:02:13.668711 26408 raft_consensus.cc:385] T 00000000000000000000000000000000 P 42db9cb347494d76ab96d630881bc311 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260430 02:02:13.668759 26408 raft_consensus.cc:740] T 00000000000000000000000000000000 P 42db9cb347494d76ab96d630881bc311 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 42db9cb347494d76ab96d630881bc311, State: Initialized, Role: FOLLOWER
I20260430 02:02:13.669260 26408 consensus_queue.cc:260] T 00000000000000000000000000000000 P 42db9cb347494d76ab96d630881bc311 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "42db9cb347494d76ab96d630881bc311" member_type: VOTER last_known_addr { host: "127.24.153.126" port: 41269 } }
I20260430 02:02:13.669381 26408 raft_consensus.cc:399] T 00000000000000000000000000000000 P 42db9cb347494d76ab96d630881bc311 [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20260430 02:02:13.669451 26408 raft_consensus.cc:493] T 00000000000000000000000000000000 P 42db9cb347494d76ab96d630881bc311 [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20260430 02:02:13.669546 26408 raft_consensus.cc:3060] T 00000000000000000000000000000000 P 42db9cb347494d76ab96d630881bc311 [term 0 FOLLOWER]: Advancing to term 1
I20260430 02:02:13.670552 26408 raft_consensus.cc:515] T 00000000000000000000000000000000 P 42db9cb347494d76ab96d630881bc311 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "42db9cb347494d76ab96d630881bc311" member_type: VOTER last_known_addr { host: "127.24.153.126" port: 41269 } }
I20260430 02:02:13.670948 26408 leader_election.cc:304] T 00000000000000000000000000000000 P 42db9cb347494d76ab96d630881bc311 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 42db9cb347494d76ab96d630881bc311; no voters:
I20260430 02:02:13.671288 26408 leader_election.cc:290] T 00000000000000000000000000000000 P 42db9cb347494d76ab96d630881bc311 [CANDIDATE]: Term 1 election: Requested vote from peers
I20260430 02:02:13.671480 26413 raft_consensus.cc:2804] T 00000000000000000000000000000000 P 42db9cb347494d76ab96d630881bc311 [term 1 FOLLOWER]: Leader election won for term 1
I20260430 02:02:13.671764 26413 raft_consensus.cc:697] T 00000000000000000000000000000000 P 42db9cb347494d76ab96d630881bc311 [term 1 LEADER]: Becoming Leader. State: Replica: 42db9cb347494d76ab96d630881bc311, State: Running, Role: LEADER
I20260430 02:02:13.672103 26413 consensus_queue.cc:237] T 00000000000000000000000000000000 P 42db9cb347494d76ab96d630881bc311 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "42db9cb347494d76ab96d630881bc311" member_type: VOTER last_known_addr { host: "127.24.153.126" port: 41269 } }
I20260430 02:02:13.672621 26408 sys_catalog.cc:565] T 00000000000000000000000000000000 P 42db9cb347494d76ab96d630881bc311 [sys.catalog]: configured and running, proceeding with master startup.
I20260430 02:02:13.674310 26415 sys_catalog.cc:455] T 00000000000000000000000000000000 P 42db9cb347494d76ab96d630881bc311 [sys.catalog]: SysCatalogTable state changed. Reason: New leader 42db9cb347494d76ab96d630881bc311. Latest consensus state: current_term: 1 leader_uuid: "42db9cb347494d76ab96d630881bc311" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "42db9cb347494d76ab96d630881bc311" member_type: VOTER last_known_addr { host: "127.24.153.126" port: 41269 } } }
I20260430 02:02:13.674624 26415 sys_catalog.cc:458] T 00000000000000000000000000000000 P 42db9cb347494d76ab96d630881bc311 [sys.catalog]: This master's current role is: LEADER
I20260430 02:02:13.674151 26414 sys_catalog.cc:455] T 00000000000000000000000000000000 P 42db9cb347494d76ab96d630881bc311 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "42db9cb347494d76ab96d630881bc311" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "42db9cb347494d76ab96d630881bc311" member_type: VOTER last_known_addr { host: "127.24.153.126" port: 41269 } } }
I20260430 02:02:13.674727 26414 sys_catalog.cc:458] T 00000000000000000000000000000000 P 42db9cb347494d76ab96d630881bc311 [sys.catalog]: This master's current role is: LEADER
I20260430 02:02:13.675287 26422 catalog_manager.cc:1485] Loading table and tablet metadata into memory...
I20260430 02:02:13.678684 26422 catalog_manager.cc:1494] Initializing Kudu cluster ID...
I20260430 02:02:13.684834 26422 catalog_manager.cc:1357] Generated new cluster ID: ea5fd90dc6ca4d64a08862161027ee63
I20260430 02:02:13.684952 26422 catalog_manager.cc:1505] Initializing Kudu internal certificate authority...
I20260430 02:02:13.699301 26422 catalog_manager.cc:1380] Generated new certificate authority record
I20260430 02:02:13.700260 26422 catalog_manager.cc:1514] Loading token signing keys...
I20260430 02:02:13.724244 26422 catalog_manager.cc:6044] T 00000000000000000000000000000000 P 42db9cb347494d76ab96d630881bc311: Generated new TSK 0
I20260430 02:02:13.725003 26422 catalog_manager.cc:1524] Initializing in-progress tserver states...
I20260430 02:02:13.735934 25189 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
/tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.24.153.65:0
--local_ip_for_outbound_sockets=127.24.153.65
--webserver_interface=127.24.153.65
--webserver_port=0
--tserver_master_addrs=127.24.153.126:41269
--builtin_ntp_servers=127.24.153.84:44061
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20260430 02:02:13.867218 26432 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260430 02:02:13.867498 26432 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260430 02:02:13.867553 26432 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260430 02:02:13.871984 26432 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260430 02:02:13.872140 26432 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.24.153.65
I20260430 02:02:13.877411 26432 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.24.153.84:44061
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.24.153.65:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.24.153.65
--webserver_port=0
--tserver_master_addrs=127.24.153.126:41269
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.26432
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.24.153.65
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8537ed5bb5e0e6ac1e481ff961e6b9d08cce7671
build type DEBUG
built by None at 30 Apr 2026 01:43:13 UTC on bdcb31816ec0
build id 11668
I20260430 02:02:13.878734 26432 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260430 02:02:13.879977 26432 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260430 02:02:13.883034 26432 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260430 02:02:13.888131 26440 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:02:13.888085 26438 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:02:13.888094 26437 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260430 02:02:13.888270 26432 server_base.cc:1061] running on GCE node
I20260430 02:02:13.888926 26432 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260430 02:02:13.889604 26432 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260430 02:02:13.890823 26432 hybrid_clock.cc:648] HybridClock initialized: now 1777514533890802 us; error 46 us; skew 500 ppm
I20260430 02:02:13.893255 26432 webserver.cc:492] Webserver started at http://127.24.153.65:41493/ using document root <none> and password file <none>
I20260430 02:02:13.893975 26432 fs_manager.cc:362] Metadata directory not provided
I20260430 02:02:13.894069 26432 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260430 02:02:13.894336 26432 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260430 02:02:13.896313 26432 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/data/instance:
uuid: "7ab3773c508c494d8e0246ad5c98fbc0"
format_stamp: "Formatted at 2026-04-30 02:02:13 on dist-test-slave-f7mg"
I20260430 02:02:13.896889 26432 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/wal/instance:
uuid: "7ab3773c508c494d8e0246ad5c98fbc0"
format_stamp: "Formatted at 2026-04-30 02:02:13 on dist-test-slave-f7mg"
I20260430 02:02:13.901317 26432 fs_manager.cc:696] Time spent creating directory manager: real 0.004s user 0.002s sys 0.004s
I20260430 02:02:13.904645 26446 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260430 02:02:13.906100 26432 fs_manager.cc:730] Time spent opening block manager: real 0.003s user 0.003s sys 0.000s
I20260430 02:02:13.906270 26432 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/data,/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/wal
uuid: "7ab3773c508c494d8e0246ad5c98fbc0"
format_stamp: "Formatted at 2026-04-30 02:02:13 on dist-test-slave-f7mg"
I20260430 02:02:13.906399 26432 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260430 02:02:13.932281 26432 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260430 02:02:13.933077 26432 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260430 02:02:13.933311 26432 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260430 02:02:13.933974 26432 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260430 02:02:13.935145 26432 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260430 02:02:13.935230 26432 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20260430 02:02:13.935303 26432 ts_tablet_manager.cc:616] Registered 0 tablets
I20260430 02:02:13.935353 26432 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20260430 02:02:13.946029 26432 rpc_server.cc:307] RPC server started. Bound to: 127.24.153.65:45451
I20260430 02:02:13.946049 26559 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.24.153.65:45451 every 8 connection(s)
I20260430 02:02:13.947062 26432 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/data/info.pb
I20260430 02:02:13.956003 25189 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu as pid 26432
I20260430 02:02:13.956161 25189 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/wal/instance
I20260430 02:02:13.957641 26560 heartbeater.cc:344] Connected to a master server at 127.24.153.126:41269
I20260430 02:02:13.957998 26560 heartbeater.cc:461] Registering TS with master...
I20260430 02:02:13.958626 26560 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
I20260430 02:02:13.958853 25189 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
/tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.24.153.66:0
--local_ip_for_outbound_sockets=127.24.153.66
--webserver_interface=127.24.153.66
--webserver_port=0
--tserver_master_addrs=127.24.153.126:41269
--builtin_ntp_servers=127.24.153.84:44061
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20260430 02:02:14.011363 26370 ts_manager.cc:194] Registered new tserver with Master: 7ab3773c508c494d8e0246ad5c98fbc0 (127.24.153.65:45451)
I20260430 02:02:14.012794 26370 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.24.153.65:51333
W20260430 02:02:14.089521 26563 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260430 02:02:14.089799 26563 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260430 02:02:14.089882 26563 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260430 02:02:14.093822 26563 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260430 02:02:14.094070 26563 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.24.153.66
I20260430 02:02:14.098816 26563 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.24.153.84:44061
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.24.153.66:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.24.153.66
--webserver_port=0
--tserver_master_addrs=127.24.153.126:41269
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.26563
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.24.153.66
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8537ed5bb5e0e6ac1e481ff961e6b9d08cce7671
build type DEBUG
built by None at 30 Apr 2026 01:43:13 UTC on bdcb31816ec0
build id 11668
I20260430 02:02:14.100080 26563 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260430 02:02:14.101296 26563 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260430 02:02:14.104396 26563 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260430 02:02:14.108989 26572 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:02:14.109010 26569 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:02:14.108989 26570 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260430 02:02:14.109284 26563 server_base.cc:1061] running on GCE node
I20260430 02:02:14.109905 26563 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260430 02:02:14.110800 26563 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260430 02:02:14.112021 26563 hybrid_clock.cc:648] HybridClock initialized: now 1777514534111951 us; error 88 us; skew 500 ppm
I20260430 02:02:14.114883 26563 webserver.cc:492] Webserver started at http://127.24.153.66:32993/ using document root <none> and password file <none>
I20260430 02:02:14.115561 26563 fs_manager.cc:362] Metadata directory not provided
I20260430 02:02:14.115624 26563 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260430 02:02:14.115798 26563 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260430 02:02:14.117486 26563 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/data/instance:
uuid: "a2771da784d84201a3de0860ab987f1f"
format_stamp: "Formatted at 2026-04-30 02:02:14 on dist-test-slave-f7mg"
I20260430 02:02:14.118072 26563 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/wal/instance:
uuid: "a2771da784d84201a3de0860ab987f1f"
format_stamp: "Formatted at 2026-04-30 02:02:14 on dist-test-slave-f7mg"
I20260430 02:02:14.122028 26563 fs_manager.cc:696] Time spent creating directory manager: real 0.004s user 0.001s sys 0.003s
I20260430 02:02:14.124636 26578 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260430 02:02:14.126190 26563 fs_manager.cc:730] Time spent opening block manager: real 0.003s user 0.003s sys 0.000s
I20260430 02:02:14.126435 26563 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/data,/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/wal
uuid: "a2771da784d84201a3de0860ab987f1f"
format_stamp: "Formatted at 2026-04-30 02:02:14 on dist-test-slave-f7mg"
I20260430 02:02:14.126565 26563 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260430 02:02:14.139688 26563 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260430 02:02:14.140453 26563 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260430 02:02:14.140704 26563 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260430 02:02:14.141333 26563 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260430 02:02:14.142505 26563 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260430 02:02:14.142586 26563 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20260430 02:02:14.142659 26563 ts_tablet_manager.cc:616] Registered 0 tablets
I20260430 02:02:14.142714 26563 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20260430 02:02:14.153338 26563 rpc_server.cc:307] RPC server started. Bound to: 127.24.153.66:44867
I20260430 02:02:14.153363 26691 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.24.153.66:44867 every 8 connection(s)
I20260430 02:02:14.154425 26563 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/data/info.pb
I20260430 02:02:14.156309 25189 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu as pid 26563
I20260430 02:02:14.156462 25189 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/wal/instance
I20260430 02:02:14.158985 25189 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
/tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.24.153.67:0
--local_ip_for_outbound_sockets=127.24.153.67
--webserver_interface=127.24.153.67
--webserver_port=0
--tserver_master_addrs=127.24.153.126:41269
--builtin_ntp_servers=127.24.153.84:44061
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20260430 02:02:14.166862 26692 heartbeater.cc:344] Connected to a master server at 127.24.153.126:41269
I20260430 02:02:14.167207 26692 heartbeater.cc:461] Registering TS with master...
I20260430 02:02:14.168680 26692 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
I20260430 02:02:14.207707 26370 ts_manager.cc:194] Registered new tserver with Master: a2771da784d84201a3de0860ab987f1f (127.24.153.66:44867)
I20260430 02:02:14.208671 26370 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.24.153.66:44141
W20260430 02:02:14.284842 26695 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260430 02:02:14.285111 26695 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260430 02:02:14.285163 26695 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260430 02:02:14.288908 26695 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260430 02:02:14.289044 26695 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.24.153.67
I20260430 02:02:14.293355 26695 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.24.153.84:44061
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.24.153.67:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.24.153.67
--webserver_port=0
--tserver_master_addrs=127.24.153.126:41269
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.26695
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.24.153.67
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8537ed5bb5e0e6ac1e481ff961e6b9d08cce7671
build type DEBUG
built by None at 30 Apr 2026 01:43:13 UTC on bdcb31816ec0
build id 11668
I20260430 02:02:14.294615 26695 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260430 02:02:14.295739 26695 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260430 02:02:14.298770 26695 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260430 02:02:14.303474 26704 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260430 02:02:14.303648 26695 server_base.cc:1061] running on GCE node
W20260430 02:02:14.303485 26701 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:02:14.303485 26702 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260430 02:02:14.304361 26695 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260430 02:02:14.305058 26695 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260430 02:02:14.306728 26695 hybrid_clock.cc:648] HybridClock initialized: now 1777514534306691 us; error 82 us; skew 500 ppm
I20260430 02:02:14.308998 26695 webserver.cc:492] Webserver started at http://127.24.153.67:43087/ using document root <none> and password file <none>
I20260430 02:02:14.309743 26695 fs_manager.cc:362] Metadata directory not provided
I20260430 02:02:14.309834 26695 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260430 02:02:14.310245 26695 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260430 02:02:14.312002 26695 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/data/instance:
uuid: "5a60f2bd808d4e6da03625fcdb9225e8"
format_stamp: "Formatted at 2026-04-30 02:02:14 on dist-test-slave-f7mg"
I20260430 02:02:14.312603 26695 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/wal/instance:
uuid: "5a60f2bd808d4e6da03625fcdb9225e8"
format_stamp: "Formatted at 2026-04-30 02:02:14 on dist-test-slave-f7mg"
I20260430 02:02:14.317876 26695 fs_manager.cc:696] Time spent creating directory manager: real 0.005s user 0.006s sys 0.000s
I20260430 02:02:14.321105 26710 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260430 02:02:14.322620 26695 fs_manager.cc:730] Time spent opening block manager: real 0.003s user 0.003s sys 0.000s
I20260430 02:02:14.322788 26695 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/data,/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/wal
uuid: "5a60f2bd808d4e6da03625fcdb9225e8"
format_stamp: "Formatted at 2026-04-30 02:02:14 on dist-test-slave-f7mg"
I20260430 02:02:14.322914 26695 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260430 02:02:14.338383 26695 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260430 02:02:14.339114 26695 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260430 02:02:14.339337 26695 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260430 02:02:14.339989 26695 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260430 02:02:14.341056 26695 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260430 02:02:14.341140 26695 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20260430 02:02:14.341214 26695 ts_tablet_manager.cc:616] Registered 0 tablets
I20260430 02:02:14.341269 26695 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20260430 02:02:14.354140 26695 rpc_server.cc:307] RPC server started. Bound to: 127.24.153.67:43357
I20260430 02:02:14.354417 26823 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.24.153.67:43357 every 8 connection(s)
I20260430 02:02:14.355288 26695 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/data/info.pb
I20260430 02:02:14.358373 25189 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu as pid 26695
I20260430 02:02:14.358548 25189 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/wal/instance
I20260430 02:02:14.360771 25189 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
/tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.24.153.68:0
--local_ip_for_outbound_sockets=127.24.153.68
--webserver_interface=127.24.153.68
--webserver_port=0
--tserver_master_addrs=127.24.153.126:41269
--builtin_ntp_servers=127.24.153.84:44061
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20260430 02:02:14.366457 26824 heartbeater.cc:344] Connected to a master server at 127.24.153.126:41269
I20260430 02:02:14.366761 26824 heartbeater.cc:461] Registering TS with master...
I20260430 02:02:14.367452 26824 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
I20260430 02:02:14.422075 26370 ts_manager.cc:194] Registered new tserver with Master: 5a60f2bd808d4e6da03625fcdb9225e8 (127.24.153.67:43357)
I20260430 02:02:14.423151 26370 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.24.153.67:38121
W20260430 02:02:14.505661 26827 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260430 02:02:14.506024 26827 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260430 02:02:14.506105 26827 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260430 02:02:14.510906 26827 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260430 02:02:14.511291 26827 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.24.153.68
I20260430 02:02:14.516587 26827 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.24.153.84:44061
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.24.153.68:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.24.153.68
--webserver_port=0
--tserver_master_addrs=127.24.153.126:41269
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.26827
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.24.153.68
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8537ed5bb5e0e6ac1e481ff961e6b9d08cce7671
build type DEBUG
built by None at 30 Apr 2026 01:43:13 UTC on bdcb31816ec0
build id 11668
I20260430 02:02:14.517845 26827 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260430 02:02:14.519141 26827 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260430 02:02:14.522323 26827 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260430 02:02:14.528064 26834 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:02:14.528045 26836 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260430 02:02:14.528218 26827 server_base.cc:1061] running on GCE node
W20260430 02:02:14.528045 26833 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260430 02:02:14.529098 26827 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260430 02:02:14.529801 26827 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260430 02:02:14.531046 26827 hybrid_clock.cc:648] HybridClock initialized: now 1777514534530990 us; error 72 us; skew 500 ppm
I20260430 02:02:14.533480 26827 webserver.cc:492] Webserver started at http://127.24.153.68:34395/ using document root <none> and password file <none>
I20260430 02:02:14.534178 26827 fs_manager.cc:362] Metadata directory not provided
I20260430 02:02:14.534264 26827 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260430 02:02:14.534525 26827 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260430 02:02:14.536432 26827 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/data/instance:
uuid: "fdcc2c1450744cb499d898b871318fa0"
format_stamp: "Formatted at 2026-04-30 02:02:14 on dist-test-slave-f7mg"
I20260430 02:02:14.537045 26827 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/wal/instance:
uuid: "fdcc2c1450744cb499d898b871318fa0"
format_stamp: "Formatted at 2026-04-30 02:02:14 on dist-test-slave-f7mg"
I20260430 02:02:14.541121 26827 fs_manager.cc:696] Time spent creating directory manager: real 0.004s user 0.006s sys 0.000s
I20260430 02:02:14.544134 26842 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260430 02:02:14.545257 26827 fs_manager.cc:730] Time spent opening block manager: real 0.002s user 0.002s sys 0.000s
I20260430 02:02:14.545414 26827 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/data,/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/wal
uuid: "fdcc2c1450744cb499d898b871318fa0"
format_stamp: "Formatted at 2026-04-30 02:02:14 on dist-test-slave-f7mg"
I20260430 02:02:14.545555 26827 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260430 02:02:14.570391 26827 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260430 02:02:14.571156 26827 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260430 02:02:14.571576 26827 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260430 02:02:14.572260 26827 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260430 02:02:14.573495 26827 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260430 02:02:14.573580 26827 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20260430 02:02:14.573652 26827 ts_tablet_manager.cc:616] Registered 0 tablets
I20260430 02:02:14.573702 26827 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20260430 02:02:14.587797 26827 rpc_server.cc:307] RPC server started. Bound to: 127.24.153.68:41611
I20260430 02:02:14.587991 26955 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.24.153.68:41611 every 8 connection(s)
I20260430 02:02:14.588919 26827 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/data/info.pb
I20260430 02:02:14.592113 25189 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu as pid 26827
I20260430 02:02:14.592235 25189 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/wal/instance
I20260430 02:02:14.600930 26956 heartbeater.cc:344] Connected to a master server at 127.24.153.126:41269
I20260430 02:02:14.601204 26956 heartbeater.cc:461] Registering TS with master...
I20260430 02:02:14.601743 26956 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
I20260430 02:02:14.643707 26370 ts_manager.cc:194] Registered new tserver with Master: fdcc2c1450744cb499d898b871318fa0 (127.24.153.68:41611)
I20260430 02:02:14.644618 26370 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.24.153.68:60157
I20260430 02:02:14.647838 25189 external_mini_cluster.cc:949] 4 TS(s) registered with all masters
I20260430 02:02:14.673303 26370 catalog_manager.cc:2257] Servicing CreateTable request from {username='slave'} at 127.0.0.1:43146:
name: "test-workload"
schema {
columns {
name: "key"
type: INT32
is_key: true
is_nullable: false
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
columns {
name: "int_val"
type: INT32
is_key: false
is_nullable: false
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
columns {
name: "string_val"
type: STRING
is_key: false
is_nullable: true
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
}
num_replicas: 3
split_rows_range_bounds {
}
partition_schema {
range_schema {
columns {
name: "key"
}
}
}
I20260430 02:02:14.693255 26890 tablet_service.cc:1511] Processing CreateTablet for tablet 5f11b12f19254ea9a609af28f72cdaa6 (DEFAULT_TABLE table=test-workload [id=83721e80a83c49f5830017aea59e3016]), partition=RANGE (key) PARTITION UNBOUNDED
I20260430 02:02:14.693324 26494 tablet_service.cc:1511] Processing CreateTablet for tablet 5f11b12f19254ea9a609af28f72cdaa6 (DEFAULT_TABLE table=test-workload [id=83721e80a83c49f5830017aea59e3016]), partition=RANGE (key) PARTITION UNBOUNDED
I20260430 02:02:14.693324 26758 tablet_service.cc:1511] Processing CreateTablet for tablet 5f11b12f19254ea9a609af28f72cdaa6 (DEFAULT_TABLE table=test-workload [id=83721e80a83c49f5830017aea59e3016]), partition=RANGE (key) PARTITION UNBOUNDED
I20260430 02:02:14.694267 26890 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 5f11b12f19254ea9a609af28f72cdaa6. 1 dirs total, 0 dirs full, 0 dirs failed
I20260430 02:02:14.694268 26494 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 5f11b12f19254ea9a609af28f72cdaa6. 1 dirs total, 0 dirs full, 0 dirs failed
I20260430 02:02:14.694267 26758 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 5f11b12f19254ea9a609af28f72cdaa6. 1 dirs total, 0 dirs full, 0 dirs failed
I20260430 02:02:14.699433 26979 tablet_bootstrap.cc:492] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0: Bootstrap starting.
I20260430 02:02:14.700253 26980 tablet_bootstrap.cc:492] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8: Bootstrap starting.
I20260430 02:02:14.700636 26981 tablet_bootstrap.cc:492] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0: Bootstrap starting.
I20260430 02:02:14.702821 26980 tablet_bootstrap.cc:654] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8: Neither blocks nor log segments found. Creating new log.
I20260430 02:02:14.702824 26981 tablet_bootstrap.cc:654] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0: Neither blocks nor log segments found. Creating new log.
I20260430 02:02:14.702824 26979 tablet_bootstrap.cc:654] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0: Neither blocks nor log segments found. Creating new log.
I20260430 02:02:14.704015 26980 log.cc:826] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8: Log is configured to *not* fsync() on all Append() calls
I20260430 02:02:14.704084 26981 log.cc:826] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0: Log is configured to *not* fsync() on all Append() calls
I20260430 02:02:14.704084 26979 log.cc:826] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0: Log is configured to *not* fsync() on all Append() calls
I20260430 02:02:14.705912 26980 tablet_bootstrap.cc:492] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8: No bootstrap required, opened a new log
I20260430 02:02:14.705996 26981 tablet_bootstrap.cc:492] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0: No bootstrap required, opened a new log
I20260430 02:02:14.705996 26979 tablet_bootstrap.cc:492] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0: No bootstrap required, opened a new log
I20260430 02:02:14.706199 26981 ts_tablet_manager.cc:1403] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0: Time spent bootstrapping tablet: real 0.006s user 0.005s sys 0.000s
I20260430 02:02:14.706192 26979 ts_tablet_manager.cc:1403] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0: Time spent bootstrapping tablet: real 0.007s user 0.005s sys 0.000s
I20260430 02:02:14.706188 26980 ts_tablet_manager.cc:1403] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8: Time spent bootstrapping tablet: real 0.006s user 0.004s sys 0.001s
I20260430 02:02:14.709601 26980 raft_consensus.cc:359] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:14.709646 26979 raft_consensus.cc:359] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:14.709649 26981 raft_consensus.cc:359] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:14.709867 26980 raft_consensus.cc:385] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260430 02:02:14.709865 26981 raft_consensus.cc:385] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260430 02:02:14.709951 26981 raft_consensus.cc:740] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 7ab3773c508c494d8e0246ad5c98fbc0, State: Initialized, Role: FOLLOWER
I20260430 02:02:14.709951 26980 raft_consensus.cc:740] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 5a60f2bd808d4e6da03625fcdb9225e8, State: Initialized, Role: FOLLOWER
I20260430 02:02:14.709865 26979 raft_consensus.cc:385] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260430 02:02:14.710016 26979 raft_consensus.cc:740] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: fdcc2c1450744cb499d898b871318fa0, State: Initialized, Role: FOLLOWER
I20260430 02:02:14.710449 26980 consensus_queue.cc:260] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:14.710451 26981 consensus_queue.cc:260] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:14.710454 26979 consensus_queue.cc:260] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:14.711525 26981 ts_tablet_manager.cc:1434] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0: Time spent starting tablet: real 0.005s user 0.001s sys 0.003s
I20260430 02:02:14.711525 26979 ts_tablet_manager.cc:1434] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0: Time spent starting tablet: real 0.005s user 0.001s sys 0.003s
I20260430 02:02:14.711536 26980 ts_tablet_manager.cc:1434] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8: Time spent starting tablet: real 0.005s user 0.003s sys 0.002s
I20260430 02:02:14.711705 26560 heartbeater.cc:499] Master 127.24.153.126:41269 was elected leader, sending a full tablet report...
I20260430 02:02:14.711823 26824 heartbeater.cc:499] Master 127.24.153.126:41269 was elected leader, sending a full tablet report...
I20260430 02:02:14.712451 26956 heartbeater.cc:499] Master 127.24.153.126:41269 was elected leader, sending a full tablet report...
I20260430 02:02:14.725618 26985 raft_consensus.cc:493] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260430 02:02:14.725812 26985 raft_consensus.cc:515] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:14.726986 26985 leader_election.cc:290] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers fdcc2c1450744cb499d898b871318fa0 (127.24.153.68:41611), 7ab3773c508c494d8e0246ad5c98fbc0 (127.24.153.65:45451)
I20260430 02:02:14.732306 26514 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" is_pre_election: true
I20260430 02:02:14.732620 26910 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "fdcc2c1450744cb499d898b871318fa0" is_pre_election: true
I20260430 02:02:14.732734 26514 raft_consensus.cc:2468] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 5a60f2bd808d4e6da03625fcdb9225e8 in term 0.
I20260430 02:02:14.732833 26910 raft_consensus.cc:2468] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 5a60f2bd808d4e6da03625fcdb9225e8 in term 0.
I20260430 02:02:14.733333 26712 leader_election.cc:304] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 5a60f2bd808d4e6da03625fcdb9225e8, fdcc2c1450744cb499d898b871318fa0; no voters:
I20260430 02:02:14.733623 26985 raft_consensus.cc:2804] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20260430 02:02:14.733698 26985 raft_consensus.cc:493] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260430 02:02:14.733742 26985 raft_consensus.cc:3060] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 0 FOLLOWER]: Advancing to term 1
I20260430 02:02:14.734826 26985 raft_consensus.cc:515] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:14.735236 26985 leader_election.cc:290] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [CANDIDATE]: Term 1 election: Requested vote from peers fdcc2c1450744cb499d898b871318fa0 (127.24.153.68:41611), 7ab3773c508c494d8e0246ad5c98fbc0 (127.24.153.65:45451)
I20260430 02:02:14.735678 26910 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "fdcc2c1450744cb499d898b871318fa0"
I20260430 02:02:14.735826 26910 raft_consensus.cc:3060] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 0 FOLLOWER]: Advancing to term 1
I20260430 02:02:14.735945 26514 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "7ab3773c508c494d8e0246ad5c98fbc0"
I20260430 02:02:14.736102 26514 raft_consensus.cc:3060] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 0 FOLLOWER]: Advancing to term 1
I20260430 02:02:14.736991 26910 raft_consensus.cc:2468] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 5a60f2bd808d4e6da03625fcdb9225e8 in term 1.
I20260430 02:02:14.737287 26514 raft_consensus.cc:2468] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 5a60f2bd808d4e6da03625fcdb9225e8 in term 1.
I20260430 02:02:14.737464 26712 leader_election.cc:304] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 5a60f2bd808d4e6da03625fcdb9225e8, fdcc2c1450744cb499d898b871318fa0; no voters:
I20260430 02:02:14.737756 26985 raft_consensus.cc:2804] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 1 FOLLOWER]: Leader election won for term 1
I20260430 02:02:14.738083 26985 raft_consensus.cc:697] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 1 LEADER]: Becoming Leader. State: Replica: 5a60f2bd808d4e6da03625fcdb9225e8, State: Running, Role: LEADER
I20260430 02:02:14.738404 26985 consensus_queue.cc:237] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:14.742153 26370 catalog_manager.cc:5671] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 reported cstate change: term changed from 0 to 1, leader changed from <none> to 5a60f2bd808d4e6da03625fcdb9225e8 (127.24.153.67). New cstate: current_term: 1 leader_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } health_report { overall_health: HEALTHY } } }
I20260430 02:02:14.749531 25189 maintenance_mode-itest.cc:745] Restarting batch of 4 tservers: a2771da784d84201a3de0860ab987f1f,fdcc2c1450744cb499d898b871318fa0,7ab3773c508c494d8e0246ad5c98fbc0,5a60f2bd808d4e6da03625fcdb9225e8
I20260430 02:02:14.824427 26985 consensus_queue.cc:1048] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [LEADER]: Connected to new peer: Peer: permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260430 02:02:14.831063 26910 raft_consensus.cc:1275] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 1 FOLLOWER]: Refusing update from remote peer 5a60f2bd808d4e6da03625fcdb9225e8: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260430 02:02:14.832005 26985 consensus_queue.cc:1048] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [LEADER]: Connected to new peer: Peer: permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
W20260430 02:02:14.839864 26957 tablet.cc:2404] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0: Can't schedule compaction. Clean time has not been advanced past its initial value.
I20260430 02:02:14.859822 27008 mvcc.cc:204] Tried to move back new op lower bound from 7280699534667497472 to 7280699534291640320. Current Snapshot: MvccSnapshot[applied={T|T < 7280699534659301376}]
I20260430 02:02:15.013121 26369 ts_manager.cc:295] Set tserver state for fdcc2c1450744cb499d898b871318fa0 to MAINTENANCE_MODE
I20260430 02:02:15.107668 26369 ts_manager.cc:295] Set tserver state for 7ab3773c508c494d8e0246ad5c98fbc0 to MAINTENANCE_MODE
I20260430 02:02:15.151732 26369 ts_manager.cc:295] Set tserver state for 5a60f2bd808d4e6da03625fcdb9225e8 to MAINTENANCE_MODE
I20260430 02:02:15.177199 26369 ts_manager.cc:295] Set tserver state for a2771da784d84201a3de0860ab987f1f to MAINTENANCE_MODE
I20260430 02:02:15.211922 26692 heartbeater.cc:499] Master 127.24.153.126:41269 was elected leader, sending a full tablet report...
I20260430 02:02:15.306715 26890 tablet_service.cc:1460] Tablet server fdcc2c1450744cb499d898b871318fa0 set to quiescing
I20260430 02:02:15.307137 26890 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260430 02:02:15.512097 26758 tablet_service.cc:1460] Tablet server 5a60f2bd808d4e6da03625fcdb9225e8 set to quiescing
I20260430 02:02:15.512305 26758 tablet_service.cc:1467] Tablet server has 1 leaders and 1 scanners
I20260430 02:02:15.526960 26989 raft_consensus.cc:993] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8: : Instructing follower 7ab3773c508c494d8e0246ad5c98fbc0 to start an election
I20260430 02:02:15.527098 26989 raft_consensus.cc:1081] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 1 LEADER]: Signalling peer 7ab3773c508c494d8e0246ad5c98fbc0 to start an election
I20260430 02:02:15.529982 26514 tablet_service.cc:2044] Received Run Leader Election RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6"
dest_uuid: "7ab3773c508c494d8e0246ad5c98fbc0"
from {username='slave'} at 127.24.153.67:38145
I20260430 02:02:15.530238 26514 raft_consensus.cc:493] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 1 FOLLOWER]: Starting forced leader election (received explicit request)
I20260430 02:02:15.530326 26514 raft_consensus.cc:3060] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 1 FOLLOWER]: Advancing to term 2
I20260430 02:02:15.531406 26514 raft_consensus.cc:515] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 2 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:15.532748 26514 leader_election.cc:290] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [CANDIDATE]: Term 2 election: Requested vote from peers fdcc2c1450744cb499d898b871318fa0 (127.24.153.68:41611), 5a60f2bd808d4e6da03625fcdb9225e8 (127.24.153.67:43357)
I20260430 02:02:15.540807 26514 raft_consensus.cc:1240] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 2 FOLLOWER]: Rejecting Update request from peer 5a60f2bd808d4e6da03625fcdb9225e8 for earlier term 1. Current term is 2. Ops: [1.59-1.59]
I20260430 02:02:15.543231 27020 consensus_queue.cc:1059] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [LEADER]: Peer responded invalid term: Peer: permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 }, Status: INVALID_TERM, Last received: 1.58, Next index: 59, Last known committed idx: 58, Time since last communication: 0.000s
I20260430 02:02:15.543618 27020 raft_consensus.cc:3055] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 1 LEADER]: Stepping down as leader of term 1
I20260430 02:02:15.543675 27020 raft_consensus.cc:740] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 1 LEADER]: Becoming Follower/Learner. State: Replica: 5a60f2bd808d4e6da03625fcdb9225e8, State: Running, Role: LEADER
I20260430 02:02:15.543834 27020 consensus_queue.cc:260] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 59, Committed index: 59, Last appended: 1.61, Last appended by leader: 61, Current term: 1, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:15.543958 27020 raft_consensus.cc:3060] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 1 FOLLOWER]: Advancing to term 2
I20260430 02:02:15.544950 26778 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" candidate_term: 2 candidate_status { last_received { term: 1 index: 58 } } ignore_live_leader: true dest_uuid: "5a60f2bd808d4e6da03625fcdb9225e8"
I20260430 02:02:15.545181 26778 raft_consensus.cc:2410] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 2 FOLLOWER]: Leader election vote request: Denying vote to candidate 7ab3773c508c494d8e0246ad5c98fbc0 for term 2 because replica has last-logged OpId of term: 1 index: 61, which is greater than that of the candidate, which has last-logged OpId of term: 1 index: 58.
I20260430 02:02:15.549759 26909 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" candidate_term: 2 candidate_status { last_received { term: 1 index: 58 } } ignore_live_leader: true dest_uuid: "fdcc2c1450744cb499d898b871318fa0"
I20260430 02:02:15.551357 26448 leader_election.cc:304] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [CANDIDATE]: Term 2 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 7ab3773c508c494d8e0246ad5c98fbc0; no voters: 5a60f2bd808d4e6da03625fcdb9225e8, fdcc2c1450744cb499d898b871318fa0
W20260430 02:02:15.553882 26734 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45210: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:15.577051 26870 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48238: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
I20260430 02:02:15.578766 27104 raft_consensus.cc:2749] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 2 FOLLOWER]: Leader election lost for term 2. Reason: could not achieve majority
I20260430 02:02:15.581523 26494 tablet_service.cc:1460] Tablet server 7ab3773c508c494d8e0246ad5c98fbc0 set to quiescing
I20260430 02:02:15.581712 26494 tablet_service.cc:1467] Tablet server has 0 leaders and 1 scanners
W20260430 02:02:15.582592 26472 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46676: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:15.591075 26734 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45210: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
I20260430 02:02:15.597353 26626 tablet_service.cc:1460] Tablet server a2771da784d84201a3de0860ab987f1f set to quiescing
I20260430 02:02:15.597581 26626 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20260430 02:02:15.599490 26870 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48238: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:15.609303 26474 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46676: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:15.621690 26734 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45210: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:15.633710 26870 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48238: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:15.645653 26474 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46676: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:15.659279 26734 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45210: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:15.673341 26870 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48238: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:15.688335 26474 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46676: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:15.705705 26734 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45210: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:15.720662 26870 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48238: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:15.740465 26474 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46676: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:15.762413 26734 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45210: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:15.782536 26870 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48238: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:15.801651 26474 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46676: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:15.808107 27020 raft_consensus.cc:670] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8: failed to trigger leader election: Illegal state: leader elections are disabled
W20260430 02:02:15.826503 26734 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45210: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:15.847615 27112 raft_consensus.cc:670] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0: failed to trigger leader election: Illegal state: leader elections are disabled
W20260430 02:02:15.848984 26870 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48238: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:15.871248 26474 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46676: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:15.898612 26734 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45210: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:15.922869 26870 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48238: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:15.948033 26474 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46676: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:15.979658 26734 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45210: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:16.006642 26870 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48238: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:16.016162 27104 raft_consensus.cc:670] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0: failed to trigger leader election: Illegal state: leader elections are disabled
W20260430 02:02:16.037317 26474 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46676: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:16.071730 26734 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45210: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:16.105870 26870 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48238: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:16.140180 26474 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46676: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:16.174947 26734 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45210: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:16.211937 26870 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48238: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:16.248922 26474 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46676: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:16.287276 26734 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45210: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:16.325508 26870 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48238: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:16.365720 26474 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46676: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:16.406193 26734 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45210: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:16.447093 26870 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48238: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:16.491134 26474 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46676: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:16.533583 26734 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45210: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:16.579078 26870 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48238: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:16.623749 26474 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46676: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:16.672544 26734 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45210: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:16.720558 26870 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48238: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
I20260430 02:02:16.725687 26758 tablet_service.cc:1460] Tablet server 5a60f2bd808d4e6da03625fcdb9225e8 set to quiescing
I20260430 02:02:16.725884 26758 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20260430 02:02:16.769769 26474 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46676: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
I20260430 02:02:16.804564 26494 tablet_service.cc:1460] Tablet server 7ab3773c508c494d8e0246ad5c98fbc0 set to quiescing
I20260430 02:02:16.804801 26494 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260430 02:02:16.815686 25189 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu with pid 26563
W20260430 02:02:16.820868 26734 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45210: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
I20260430 02:02:16.824103 25189 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
/tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.24.153.66:44867
--local_ip_for_outbound_sockets=127.24.153.66
--tserver_master_addrs=127.24.153.126:41269
--webserver_port=32993
--webserver_interface=127.24.153.66
--builtin_ntp_servers=127.24.153.84:44061
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20260430 02:02:16.870048 26870 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48238: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:16.921072 26474 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46676: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:16.943162 27135 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260430 02:02:16.943416 27135 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260430 02:02:16.943465 27135 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260430 02:02:16.947671 27135 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260430 02:02:16.947849 27135 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.24.153.66
I20260430 02:02:16.954051 27135 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.24.153.84:44061
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.24.153.66:44867
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.24.153.66
--webserver_port=32993
--tserver_master_addrs=127.24.153.126:41269
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.27135
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.24.153.66
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8537ed5bb5e0e6ac1e481ff961e6b9d08cce7671
build type DEBUG
built by None at 30 Apr 2026 01:43:13 UTC on bdcb31816ec0
build id 11668
I20260430 02:02:16.955557 27135 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260430 02:02:16.957064 27135 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260430 02:02:16.960438 27135 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260430 02:02:16.964813 27140 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:02:16.964843 27143 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260430 02:02:16.964916 27135 server_base.cc:1061] running on GCE node
W20260430 02:02:16.964814 27141 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260430 02:02:16.965852 27135 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260430 02:02:16.966583 27135 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260430 02:02:16.967777 27135 hybrid_clock.cc:648] HybridClock initialized: now 1777514536967754 us; error 38 us; skew 500 ppm
I20260430 02:02:16.970101 27135 webserver.cc:492] Webserver started at http://127.24.153.66:32993/ using document root <none> and password file <none>
I20260430 02:02:16.970794 27135 fs_manager.cc:362] Metadata directory not provided
I20260430 02:02:16.970901 27135 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260430 02:02:16.974856 27135 fs_manager.cc:714] Time spent opening directory manager: real 0.003s user 0.000s sys 0.004s
W20260430 02:02:16.976722 26734 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45210: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
I20260430 02:02:16.977630 27149 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260430 02:02:16.978982 27135 fs_manager.cc:730] Time spent opening block manager: real 0.003s user 0.002s sys 0.001s
I20260430 02:02:16.979151 27135 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/data,/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/wal
uuid: "a2771da784d84201a3de0860ab987f1f"
format_stamp: "Formatted at 2026-04-30 02:02:14 on dist-test-slave-f7mg"
I20260430 02:02:16.979640 27135 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260430 02:02:16.992395 27135 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260430 02:02:16.993153 27135 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260430 02:02:16.993383 27135 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260430 02:02:16.994220 27135 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260430 02:02:16.995350 27135 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260430 02:02:16.995430 27135 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20260430 02:02:16.995507 27135 ts_tablet_manager.cc:616] Registered 0 tablets
I20260430 02:02:16.995556 27135 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20260430 02:02:17.005403 27135 rpc_server.cc:307] RPC server started. Bound to: 127.24.153.66:44867
I20260430 02:02:17.005434 27262 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.24.153.66:44867 every 8 connection(s)
I20260430 02:02:17.006603 27135 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/data/info.pb
I20260430 02:02:17.011528 25189 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu as pid 27135
I20260430 02:02:17.011700 25189 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu with pid 26827
I20260430 02:02:17.017498 27263 heartbeater.cc:344] Connected to a master server at 127.24.153.126:41269
I20260430 02:02:17.017824 27263 heartbeater.cc:461] Registering TS with master...
I20260430 02:02:17.018647 27263 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
I20260430 02:02:17.019711 25189 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
/tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.24.153.68:41611
--local_ip_for_outbound_sockets=127.24.153.68
--tserver_master_addrs=127.24.153.126:41269
--webserver_port=34395
--webserver_interface=127.24.153.68
--builtin_ntp_servers=127.24.153.84:44061
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20260430 02:02:17.020164 26371 ts_manager.cc:194] Re-registered known tserver with Master: a2771da784d84201a3de0860ab987f1f (127.24.153.66:44867)
I20260430 02:02:17.020804 26371 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.24.153.66:49493
W20260430 02:02:17.031577 26966 meta_cache.cc:302] tablet 5f11b12f19254ea9a609af28f72cdaa6: replica fdcc2c1450744cb499d898b871318fa0 (127.24.153.68:41611) has failed: Network error: Client connection negotiation failed: client connection to 127.24.153.68:41611: connect: Connection refused (error 111)
W20260430 02:02:17.086259 26474 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46676: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:17.135056 27267 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260430 02:02:17.135331 27267 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260430 02:02:17.135385 27267 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260430 02:02:17.139592 27267 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260430 02:02:17.139778 27267 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.24.153.68
W20260430 02:02:17.140601 26734 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45210: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
I20260430 02:02:17.144181 27267 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.24.153.84:44061
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.24.153.68:41611
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.24.153.68
--webserver_port=34395
--tserver_master_addrs=127.24.153.126:41269
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.27267
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.24.153.68
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8537ed5bb5e0e6ac1e481ff961e6b9d08cce7671
build type DEBUG
built by None at 30 Apr 2026 01:43:13 UTC on bdcb31816ec0
build id 11668
I20260430 02:02:17.145365 27267 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260430 02:02:17.146550 27267 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260430 02:02:17.149391 27267 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
I20260430 02:02:17.154098 27267 server_base.cc:1061] running on GCE node
W20260430 02:02:17.154039 27274 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:02:17.154039 27273 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:02:17.154039 27276 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260430 02:02:17.154845 27267 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260430 02:02:17.155587 27267 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260430 02:02:17.156785 27267 hybrid_clock.cc:648] HybridClock initialized: now 1777514537156752 us; error 51 us; skew 500 ppm
I20260430 02:02:17.159581 27267 webserver.cc:492] Webserver started at http://127.24.153.68:34395/ using document root <none> and password file <none>
I20260430 02:02:17.160336 27267 fs_manager.cc:362] Metadata directory not provided
I20260430 02:02:17.160449 27267 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260430 02:02:17.164297 27267 fs_manager.cc:714] Time spent opening directory manager: real 0.003s user 0.000s sys 0.004s
I20260430 02:02:17.166615 27282 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260430 02:02:17.167863 27267 fs_manager.cc:730] Time spent opening block manager: real 0.002s user 0.003s sys 0.000s
I20260430 02:02:17.168011 27267 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/data,/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/wal
uuid: "fdcc2c1450744cb499d898b871318fa0"
format_stamp: "Formatted at 2026-04-30 02:02:14 on dist-test-slave-f7mg"
I20260430 02:02:17.168471 27267 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260430 02:02:17.201608 27267 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260430 02:02:17.202389 27267 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260430 02:02:17.202545 27267 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260430 02:02:17.203045 27267 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260430 02:02:17.204384 27289 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20260430 02:02:17.207803 27267 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20260430 02:02:17.207877 27267 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.004s user 0.000s sys 0.000s
I20260430 02:02:17.207980 27267 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20260430 02:02:17.209685 27267 ts_tablet_manager.cc:616] Registered 1 tablets
I20260430 02:02:17.209824 27267 ts_tablet_manager.cc:595] Time spent register tablets: real 0.002s user 0.002s sys 0.000s
I20260430 02:02:17.210155 27289 tablet_bootstrap.cc:492] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0: Bootstrap starting.
I20260430 02:02:17.221972 27267 rpc_server.cc:307] RPC server started. Bound to: 127.24.153.68:41611
I20260430 02:02:17.221988 27396 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.24.153.68:41611 every 8 connection(s)
I20260430 02:02:17.223089 27267 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/data/info.pb
I20260430 02:02:17.227347 25189 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu as pid 27267
I20260430 02:02:17.227610 25189 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu with pid 26432
I20260430 02:02:17.236344 27289 log.cc:826] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0: Log is configured to *not* fsync() on all Append() calls
I20260430 02:02:17.236367 27397 heartbeater.cc:344] Connected to a master server at 127.24.153.126:41269
I20260430 02:02:17.236717 27397 heartbeater.cc:461] Registering TS with master...
I20260430 02:02:17.237396 27397 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
I20260430 02:02:17.237807 25189 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
/tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.24.153.65:45451
--local_ip_for_outbound_sockets=127.24.153.65
--tserver_master_addrs=127.24.153.126:41269
--webserver_port=41493
--webserver_interface=127.24.153.65
--builtin_ntp_servers=127.24.153.84:44061
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20260430 02:02:17.239611 26371 ts_manager.cc:194] Re-registered known tserver with Master: fdcc2c1450744cb499d898b871318fa0 (127.24.153.68:41611)
I20260430 02:02:17.240576 26371 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.24.153.68:38295
W20260430 02:02:17.309818 26734 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45210: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
I20260430 02:02:17.379122 27289 tablet_bootstrap.cc:492] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0: Bootstrap replayed 1/1 log segments. Stats: ops{read=61 overwritten=0 applied=59 ignored=0} inserts{seen=2900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20260430 02:02:17.379751 27289 tablet_bootstrap.cc:492] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0: Bootstrap complete.
I20260430 02:02:17.381461 27289 ts_tablet_manager.cc:1403] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0: Time spent bootstrapping tablet: real 0.172s user 0.146s sys 0.019s
I20260430 02:02:17.384663 27289 raft_consensus.cc:359] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 1 FOLLOWER]: Replica starting. Triggering 2 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:17.387037 27289 raft_consensus.cc:740] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: fdcc2c1450744cb499d898b871318fa0, State: Initialized, Role: FOLLOWER
I20260430 02:02:17.387662 27289 consensus_queue.cc:260] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 59, Last appended: 1.61, Last appended by leader: 61, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:17.388363 27397 heartbeater.cc:499] Master 127.24.153.126:41269 was elected leader, sending a full tablet report...
I20260430 02:02:17.389009 27289 ts_tablet_manager.cc:1434] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0: Time spent starting tablet: real 0.007s user 0.010s sys 0.000s
W20260430 02:02:17.397305 27401 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260430 02:02:17.397565 27401 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260430 02:02:17.397620 27401 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260430 02:02:17.401681 27401 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260430 02:02:17.401849 27401 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.24.153.65
I20260430 02:02:17.406486 27401 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.24.153.84:44061
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.24.153.65:45451
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.24.153.65
--webserver_port=41493
--tserver_master_addrs=127.24.153.126:41269
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.27401
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.24.153.65
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8537ed5bb5e0e6ac1e481ff961e6b9d08cce7671
build type DEBUG
built by None at 30 Apr 2026 01:43:13 UTC on bdcb31816ec0
build id 11668
I20260430 02:02:17.407722 27401 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260430 02:02:17.408926 27401 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260430 02:02:17.411928 27401 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260430 02:02:17.416741 27410 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:02:17.416831 27409 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:02:17.416757 27412 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260430 02:02:17.417063 27401 server_base.cc:1061] running on GCE node
I20260430 02:02:17.417544 27401 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260430 02:02:17.418255 27401 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260430 02:02:17.419464 27401 hybrid_clock.cc:648] HybridClock initialized: now 1777514537419414 us; error 64 us; skew 500 ppm
I20260430 02:02:17.421563 27401 webserver.cc:492] Webserver started at http://127.24.153.65:41493/ using document root <none> and password file <none>
I20260430 02:02:17.422246 27401 fs_manager.cc:362] Metadata directory not provided
I20260430 02:02:17.422327 27401 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260430 02:02:17.426019 27401 fs_manager.cc:714] Time spent opening directory manager: real 0.002s user 0.004s sys 0.000s
I20260430 02:02:17.428921 27418 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260430 02:02:17.430294 27401 fs_manager.cc:730] Time spent opening block manager: real 0.003s user 0.003s sys 0.000s
I20260430 02:02:17.430423 27401 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/data,/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/wal
uuid: "7ab3773c508c494d8e0246ad5c98fbc0"
format_stamp: "Formatted at 2026-04-30 02:02:13 on dist-test-slave-f7mg"
I20260430 02:02:17.430912 27401 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260430 02:02:17.446074 27401 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260430 02:02:17.446822 27401 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260430 02:02:17.447033 27401 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260430 02:02:17.447750 27401 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260430 02:02:17.449259 27425 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20260430 02:02:17.453433 27401 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20260430 02:02:17.453599 27401 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.005s user 0.001s sys 0.000s
I20260430 02:02:17.453742 27401 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20260430 02:02:17.455963 27401 ts_tablet_manager.cc:616] Registered 1 tablets
I20260430 02:02:17.456040 27401 ts_tablet_manager.cc:595] Time spent register tablets: real 0.002s user 0.001s sys 0.000s
I20260430 02:02:17.456544 27425 tablet_bootstrap.cc:492] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0: Bootstrap starting.
I20260430 02:02:17.469990 27401 rpc_server.cc:307] RPC server started. Bound to: 127.24.153.65:45451
I20260430 02:02:17.470046 27532 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.24.153.65:45451 every 8 connection(s)
I20260430 02:02:17.471109 27401 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/data/info.pb
I20260430 02:02:17.477206 25189 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu as pid 27401
I20260430 02:02:17.477725 25189 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu with pid 26695
I20260430 02:02:17.480288 27425 log.cc:826] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0: Log is configured to *not* fsync() on all Append() calls
I20260430 02:02:17.486766 27533 heartbeater.cc:344] Connected to a master server at 127.24.153.126:41269
I20260430 02:02:17.487073 27533 heartbeater.cc:461] Registering TS with master...
I20260430 02:02:17.487947 27533 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
I20260430 02:02:17.488214 25189 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
/tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.24.153.67:43357
--local_ip_for_outbound_sockets=127.24.153.67
--tserver_master_addrs=127.24.153.126:41269
--webserver_port=43087
--webserver_interface=127.24.153.67
--builtin_ntp_servers=127.24.153.84:44061
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20260430 02:02:17.490226 26371 ts_manager.cc:194] Re-registered known tserver with Master: 7ab3773c508c494d8e0246ad5c98fbc0 (127.24.153.65:45451)
I20260430 02:02:17.491369 26371 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.24.153.65:59243
W20260430 02:02:17.563371 27310 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48260: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
I20260430 02:02:17.579370 27425 tablet_bootstrap.cc:492] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0: Bootstrap replayed 1/1 log segments. Stats: ops{read=58 overwritten=0 applied=58 ignored=0} inserts{seen=2850 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20260430 02:02:17.579800 27425 tablet_bootstrap.cc:492] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0: Bootstrap complete.
I20260430 02:02:17.580914 27425 ts_tablet_manager.cc:1403] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0: Time spent bootstrapping tablet: real 0.125s user 0.107s sys 0.012s
I20260430 02:02:17.583076 27425 raft_consensus.cc:359] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 2 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:17.583456 27425 raft_consensus.cc:740] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 2 FOLLOWER]: Becoming Follower/Learner. State: Replica: 7ab3773c508c494d8e0246ad5c98fbc0, State: Initialized, Role: FOLLOWER
I20260430 02:02:17.583902 27425 consensus_queue.cc:260] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 58, Last appended: 1.58, Last appended by leader: 58, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:17.584697 27533 heartbeater.cc:499] Master 127.24.153.126:41269 was elected leader, sending a full tablet report...
I20260430 02:02:17.584853 27425 ts_tablet_manager.cc:1434] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0: Time spent starting tablet: real 0.004s user 0.006s sys 0.000s
W20260430 02:02:17.606307 27537 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260430 02:02:17.606628 27537 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260430 02:02:17.606740 27537 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260430 02:02:17.610934 27537 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260430 02:02:17.611163 27537 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.24.153.67
I20260430 02:02:17.615757 27537 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.24.153.84:44061
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.24.153.67:43357
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.24.153.67
--webserver_port=43087
--tserver_master_addrs=127.24.153.126:41269
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.27537
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.24.153.67
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8537ed5bb5e0e6ac1e481ff961e6b9d08cce7671
build type DEBUG
built by None at 30 Apr 2026 01:43:13 UTC on bdcb31816ec0
build id 11668
I20260430 02:02:17.616950 27537 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260430 02:02:17.618229 27537 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260430 02:02:17.622742 27537 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260430 02:02:17.629735 27547 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:02:17.629743 27544 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:02:17.630203 27545 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260430 02:02:17.630352 27537 server_base.cc:1061] running on GCE node
I20260430 02:02:17.631443 27537 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260430 02:02:17.632133 27537 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
W20260430 02:02:17.632926 27447 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46732: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
I20260430 02:02:17.633399 27537 hybrid_clock.cc:648] HybridClock initialized: now 1777514537633371 us; error 47 us; skew 500 ppm
I20260430 02:02:17.635478 27537 webserver.cc:492] Webserver started at http://127.24.153.67:43087/ using document root <none> and password file <none>
I20260430 02:02:17.636087 27537 fs_manager.cc:362] Metadata directory not provided
I20260430 02:02:17.636158 27537 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260430 02:02:17.640100 27537 fs_manager.cc:714] Time spent opening directory manager: real 0.002s user 0.001s sys 0.003s
I20260430 02:02:17.642709 27553 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260430 02:02:17.643950 27537 fs_manager.cc:730] Time spent opening block manager: real 0.003s user 0.003s sys 0.000s
I20260430 02:02:17.644073 27537 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/data,/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/wal
uuid: "5a60f2bd808d4e6da03625fcdb9225e8"
format_stamp: "Formatted at 2026-04-30 02:02:14 on dist-test-slave-f7mg"
I20260430 02:02:17.644562 27537 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260430 02:02:17.672719 27537 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260430 02:02:17.673736 27537 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260430 02:02:17.673969 27537 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260430 02:02:17.674723 27537 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260430 02:02:17.676625 27560 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20260430 02:02:17.681313 27537 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20260430 02:02:17.681437 27537 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.005s user 0.001s sys 0.000s
I20260430 02:02:17.681529 27537 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20260430 02:02:17.684082 27537 ts_tablet_manager.cc:616] Registered 1 tablets
I20260430 02:02:17.684293 27537 ts_tablet_manager.cc:595] Time spent register tablets: real 0.003s user 0.003s sys 0.000s
I20260430 02:02:17.684525 27560 tablet_bootstrap.cc:492] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8: Bootstrap starting.
I20260430 02:02:17.697100 27537 rpc_server.cc:307] RPC server started. Bound to: 127.24.153.67:43357
I20260430 02:02:17.697160 27667 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.24.153.67:43357 every 8 connection(s)
I20260430 02:02:17.698454 27537 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/data/info.pb
I20260430 02:02:17.698992 25189 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu as pid 27537
I20260430 02:02:17.712049 27668 heartbeater.cc:344] Connected to a master server at 127.24.153.126:41269
I20260430 02:02:17.712414 27668 heartbeater.cc:461] Registering TS with master...
I20260430 02:02:17.713255 27668 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
I20260430 02:02:17.715973 26371 ts_manager.cc:194] Re-registered known tserver with Master: 5a60f2bd808d4e6da03625fcdb9225e8 (127.24.153.67:43357)
I20260430 02:02:17.717187 26371 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.24.153.67:52657
I20260430 02:02:17.722851 27560 log.cc:826] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8: Log is configured to *not* fsync() on all Append() calls
I20260430 02:02:17.741531 27404 raft_consensus.cc:493] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 1 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260430 02:02:17.741741 27404 raft_consensus.cc:515] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:17.743031 27404 leader_election.cc:290] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers 7ab3773c508c494d8e0246ad5c98fbc0 (127.24.153.65:45451), 5a60f2bd808d4e6da03625fcdb9225e8 (127.24.153.67:43357)
I20260430 02:02:17.748914 27487 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "fdcc2c1450744cb499d898b871318fa0" candidate_term: 2 candidate_status { last_received { term: 1 index: 61 } } ignore_live_leader: false dest_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" is_pre_election: true
I20260430 02:02:17.749279 27487 raft_consensus.cc:2393] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 2 FOLLOWER]: Leader pre-election vote request: Denying vote to candidate fdcc2c1450744cb499d898b871318fa0 in current term 2: Already voted for candidate 7ab3773c508c494d8e0246ad5c98fbc0 in this term.
I20260430 02:02:17.748975 27622 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "fdcc2c1450744cb499d898b871318fa0" candidate_term: 2 candidate_status { last_received { term: 1 index: 61 } } ignore_live_leader: false dest_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" is_pre_election: true
W20260430 02:02:17.750792 27286 leader_election.cc:343] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [CANDIDATE]: Term 2 pre-election: Tablet error from VoteRequest() call to peer 5a60f2bd808d4e6da03625fcdb9225e8 (127.24.153.67:43357): Illegal state: must be running to vote when last-logged opid is not known
I20260430 02:02:17.750938 27286 leader_election.cc:304] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: fdcc2c1450744cb499d898b871318fa0; no voters: 5a60f2bd808d4e6da03625fcdb9225e8, 7ab3773c508c494d8e0246ad5c98fbc0
I20260430 02:02:17.751206 27404 raft_consensus.cc:3060] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 1 FOLLOWER]: Advancing to term 2
I20260430 02:02:17.753640 27404 raft_consensus.cc:2749] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 2 FOLLOWER]: Leader pre-election lost for term 2. Reason: could not achieve majority
W20260430 02:02:17.797447 27310 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48260: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
I20260430 02:02:17.828588 27560 tablet_bootstrap.cc:492] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8: Bootstrap replayed 1/1 log segments. Stats: ops{read=61 overwritten=0 applied=59 ignored=0} inserts{seen=2900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20260430 02:02:17.829084 27560 tablet_bootstrap.cc:492] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8: Bootstrap complete.
I20260430 02:02:17.830273 27560 ts_tablet_manager.cc:1403] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8: Time spent bootstrapping tablet: real 0.146s user 0.128s sys 0.008s
I20260430 02:02:17.831831 27560 raft_consensus.cc:359] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 2 FOLLOWER]: Replica starting. Triggering 2 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:17.833730 27560 raft_consensus.cc:740] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 2 FOLLOWER]: Becoming Follower/Learner. State: Replica: 5a60f2bd808d4e6da03625fcdb9225e8, State: Initialized, Role: FOLLOWER
I20260430 02:02:17.834444 27560 consensus_queue.cc:260] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 59, Last appended: 1.61, Last appended by leader: 61, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:17.835156 27668 heartbeater.cc:499] Master 127.24.153.126:41269 was elected leader, sending a full tablet report...
I20260430 02:02:17.835395 27560 ts_tablet_manager.cc:1434] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8: Time spent starting tablet: real 0.005s user 0.004s sys 0.000s
I20260430 02:02:17.844471 27538 raft_consensus.cc:493] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 2 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260430 02:02:17.844699 27538 raft_consensus.cc:515] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 2 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:17.846050 27538 leader_election.cc:290] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [CANDIDATE]: Term 3 pre-election: Requested pre-vote from peers fdcc2c1450744cb499d898b871318fa0 (127.24.153.68:41611), 5a60f2bd808d4e6da03625fcdb9225e8 (127.24.153.67:43357)
I20260430 02:02:17.850472 27622 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" candidate_term: 3 candidate_status { last_received { term: 1 index: 58 } } ignore_live_leader: false dest_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" is_pre_election: true
I20260430 02:02:17.850761 27622 raft_consensus.cc:2410] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 2 FOLLOWER]: Leader pre-election vote request: Denying vote to candidate 7ab3773c508c494d8e0246ad5c98fbc0 for term 3 because replica has last-logged OpId of term: 1 index: 61, which is greater than that of the candidate, which has last-logged OpId of term: 1 index: 58.
I20260430 02:02:17.851209 27351 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" candidate_term: 3 candidate_status { last_received { term: 1 index: 58 } } ignore_live_leader: false dest_uuid: "fdcc2c1450744cb499d898b871318fa0" is_pre_election: true
I20260430 02:02:17.851465 27351 raft_consensus.cc:2410] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 2 FOLLOWER]: Leader pre-election vote request: Denying vote to candidate 7ab3773c508c494d8e0246ad5c98fbc0 for term 3 because replica has last-logged OpId of term: 1 index: 61, which is greater than that of the candidate, which has last-logged OpId of term: 1 index: 58.
I20260430 02:02:17.851907 27420 leader_election.cc:304] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [CANDIDATE]: Term 3 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 7ab3773c508c494d8e0246ad5c98fbc0; no voters: 5a60f2bd808d4e6da03625fcdb9225e8, fdcc2c1450744cb499d898b871318fa0
I20260430 02:02:17.852195 27538 raft_consensus.cc:2749] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 2 FOLLOWER]: Leader pre-election lost for term 3. Reason: could not achieve majority
W20260430 02:02:17.865537 27447 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46732: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:17.934049 27572 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45240: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
I20260430 02:02:17.953044 27331 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260430 02:02:17.953171 27602 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260430 02:02:17.953306 27197 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260430 02:02:17.953871 27467 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20260430 02:02:18.002254 27310 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48260: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
I20260430 02:02:18.023017 27263 heartbeater.cc:499] Master 127.24.153.126:41269 was elected leader, sending a full tablet report...
W20260430 02:02:18.028303 26995 scanner-internal.cc:458] Time spent opening tablet: real 2.421s user 0.005s sys 0.000s
W20260430 02:02:18.071308 27447 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46732: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
I20260430 02:02:18.104054 27404 raft_consensus.cc:493] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 2 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260430 02:02:18.104230 27404 raft_consensus.cc:515] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 2 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:18.104745 27404 leader_election.cc:290] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [CANDIDATE]: Term 3 pre-election: Requested pre-vote from peers 7ab3773c508c494d8e0246ad5c98fbc0 (127.24.153.65:45451), 5a60f2bd808d4e6da03625fcdb9225e8 (127.24.153.67:43357)
I20260430 02:02:18.105365 27622 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "fdcc2c1450744cb499d898b871318fa0" candidate_term: 3 candidate_status { last_received { term: 1 index: 61 } } ignore_live_leader: false dest_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" is_pre_election: true
I20260430 02:02:18.105384 27487 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "fdcc2c1450744cb499d898b871318fa0" candidate_term: 3 candidate_status { last_received { term: 1 index: 61 } } ignore_live_leader: false dest_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" is_pre_election: true
I20260430 02:02:18.105572 27487 raft_consensus.cc:2468] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 2 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate fdcc2c1450744cb499d898b871318fa0 in term 2.
I20260430 02:02:18.105610 27622 raft_consensus.cc:2468] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 2 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate fdcc2c1450744cb499d898b871318fa0 in term 2.
I20260430 02:02:18.105937 27284 leader_election.cc:304] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [CANDIDATE]: Term 3 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 7ab3773c508c494d8e0246ad5c98fbc0, fdcc2c1450744cb499d898b871318fa0; no voters:
I20260430 02:02:18.106117 27404 raft_consensus.cc:2804] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 2 FOLLOWER]: Leader pre-election won for term 3
I20260430 02:02:18.106191 27404 raft_consensus.cc:493] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 2 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260430 02:02:18.106233 27404 raft_consensus.cc:3060] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 2 FOLLOWER]: Advancing to term 3
I20260430 02:02:18.107360 27404 raft_consensus.cc:515] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 3 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:18.107769 27404 leader_election.cc:290] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [CANDIDATE]: Term 3 election: Requested vote from peers 7ab3773c508c494d8e0246ad5c98fbc0 (127.24.153.65:45451), 5a60f2bd808d4e6da03625fcdb9225e8 (127.24.153.67:43357)
I20260430 02:02:18.108287 27622 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "fdcc2c1450744cb499d898b871318fa0" candidate_term: 3 candidate_status { last_received { term: 1 index: 61 } } ignore_live_leader: false dest_uuid: "5a60f2bd808d4e6da03625fcdb9225e8"
I20260430 02:02:18.108330 27487 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "fdcc2c1450744cb499d898b871318fa0" candidate_term: 3 candidate_status { last_received { term: 1 index: 61 } } ignore_live_leader: false dest_uuid: "7ab3773c508c494d8e0246ad5c98fbc0"
I20260430 02:02:18.108465 27487 raft_consensus.cc:3060] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 2 FOLLOWER]: Advancing to term 3
I20260430 02:02:18.108465 27622 raft_consensus.cc:3060] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 2 FOLLOWER]: Advancing to term 3
I20260430 02:02:18.110396 27622 raft_consensus.cc:2468] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 3 FOLLOWER]: Leader election vote request: Granting yes vote for candidate fdcc2c1450744cb499d898b871318fa0 in term 3.
I20260430 02:02:18.110462 27487 raft_consensus.cc:2468] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 3 FOLLOWER]: Leader election vote request: Granting yes vote for candidate fdcc2c1450744cb499d898b871318fa0 in term 3.
I20260430 02:02:18.110766 27286 leader_election.cc:304] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [CANDIDATE]: Term 3 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 5a60f2bd808d4e6da03625fcdb9225e8, fdcc2c1450744cb499d898b871318fa0; no voters:
I20260430 02:02:18.111002 27404 raft_consensus.cc:2804] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 3 FOLLOWER]: Leader election won for term 3
I20260430 02:02:18.111181 27404 raft_consensus.cc:697] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 3 LEADER]: Becoming Leader. State: Replica: fdcc2c1450744cb499d898b871318fa0, State: Running, Role: LEADER
I20260430 02:02:18.111524 27404 consensus_queue.cc:237] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 59, Committed index: 59, Last appended: 1.61, Last appended by leader: 61, Current term: 3, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:18.114789 26371 catalog_manager.cc:5671] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 reported cstate change: term changed from 1 to 3, leader changed from 5a60f2bd808d4e6da03625fcdb9225e8 (127.24.153.67) to fdcc2c1450744cb499d898b871318fa0 (127.24.153.68). New cstate: current_term: 3 leader_uuid: "fdcc2c1450744cb499d898b871318fa0" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } health_report { overall_health: UNKNOWN } } }
W20260430 02:02:18.138782 26994 scanner-internal.cc:458] Time spent opening tablet: real 2.415s user 0.001s sys 0.003s
I20260430 02:02:18.146785 27622 raft_consensus.cc:1275] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 3 FOLLOWER]: Refusing update from remote peer fdcc2c1450744cb499d898b871318fa0: Log matching property violated. Preceding OpId in replica: term: 1 index: 61. Preceding OpId from leader: term: 3 index: 63. (index mismatch)
I20260430 02:02:18.147029 27487 raft_consensus.cc:1275] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 3 FOLLOWER]: Refusing update from remote peer fdcc2c1450744cb499d898b871318fa0: Log matching property violated. Preceding OpId in replica: term: 1 index: 58. Preceding OpId from leader: term: 3 index: 63. (index mismatch)
I20260430 02:02:18.147604 27404 consensus_queue.cc:1048] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [LEADER]: Connected to new peer: Peer: permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 62, Last known committed idx: 59, Time since last communication: 0.000s
I20260430 02:02:18.147879 27709 consensus_queue.cc:1048] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [LEADER]: Connected to new peer: Peer: permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 62, Last known committed idx: 58, Time since last communication: 0.000s
W20260430 02:02:18.163616 26996 scanner-internal.cc:458] Time spent opening tablet: real 2.418s user 0.004s sys 0.000s
I20260430 02:02:23.402432 27467 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260430 02:02:23.414985 27602 tablet_service.cc:1467] Tablet server has 0 leaders and 3 scanners
I20260430 02:02:23.448218 27197 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260430 02:02:23.508747 27331 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
I20260430 02:02:23.956007 26369 ts_manager.cc:284] Unset tserver state for 5a60f2bd808d4e6da03625fcdb9225e8 from MAINTENANCE_MODE
I20260430 02:02:24.035269 27263 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
I20260430 02:02:24.106915 26369 ts_manager.cc:284] Unset tserver state for fdcc2c1450744cb499d898b871318fa0 from MAINTENANCE_MODE
I20260430 02:02:24.157298 26369 ts_manager.cc:284] Unset tserver state for a2771da784d84201a3de0860ab987f1f from MAINTENANCE_MODE
I20260430 02:02:24.171377 27668 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
I20260430 02:02:24.196574 27397 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
I20260430 02:02:24.207859 27533 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
I20260430 02:02:24.242813 26369 ts_manager.cc:284] Unset tserver state for 7ab3773c508c494d8e0246ad5c98fbc0 from MAINTENANCE_MODE
I20260430 02:02:24.936623 26369 ts_manager.cc:295] Set tserver state for 7ab3773c508c494d8e0246ad5c98fbc0 to MAINTENANCE_MODE
I20260430 02:02:24.949856 26371 ts_manager.cc:295] Set tserver state for fdcc2c1450744cb499d898b871318fa0 to MAINTENANCE_MODE
I20260430 02:02:24.958069 26371 ts_manager.cc:295] Set tserver state for a2771da784d84201a3de0860ab987f1f to MAINTENANCE_MODE
I20260430 02:02:25.038954 27263 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
I20260430 02:02:25.114562 26371 ts_manager.cc:295] Set tserver state for 5a60f2bd808d4e6da03625fcdb9225e8 to MAINTENANCE_MODE
I20260430 02:02:25.174882 27668 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
I20260430 02:02:25.203825 27397 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
I20260430 02:02:25.210934 27533 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
I20260430 02:02:25.538599 27197 tablet_service.cc:1460] Tablet server a2771da784d84201a3de0860ab987f1f set to quiescing
I20260430 02:02:25.538784 27197 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260430 02:02:25.600714 27467 tablet_service.cc:1460] Tablet server 7ab3773c508c494d8e0246ad5c98fbc0 set to quiescing
I20260430 02:02:25.601009 27467 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260430 02:02:25.634778 27331 tablet_service.cc:1460] Tablet server fdcc2c1450744cb499d898b871318fa0 set to quiescing
I20260430 02:02:25.634945 27331 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
I20260430 02:02:25.644073 27731 raft_consensus.cc:993] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0: : Instructing follower 5a60f2bd808d4e6da03625fcdb9225e8 to start an election
I20260430 02:02:25.644218 27731 raft_consensus.cc:1081] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 3 LEADER]: Signalling peer 5a60f2bd808d4e6da03625fcdb9225e8 to start an election
I20260430 02:02:25.647472 27620 tablet_service.cc:2044] Received Run Leader Election RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6"
dest_uuid: "5a60f2bd808d4e6da03625fcdb9225e8"
from {username='slave'} at 127.24.153.68:50369
I20260430 02:02:25.647661 27620 raft_consensus.cc:493] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 3 FOLLOWER]: Starting forced leader election (received explicit request)
I20260430 02:02:25.647730 27620 raft_consensus.cc:3060] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 3 FOLLOWER]: Advancing to term 4
I20260430 02:02:25.649041 27620 raft_consensus.cc:515] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 4 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:25.649377 27602 tablet_service.cc:1460] Tablet server 5a60f2bd808d4e6da03625fcdb9225e8 set to quiescing
I20260430 02:02:25.649502 27602 tablet_service.cc:1467] Tablet server has 0 leaders and 3 scanners
I20260430 02:02:25.650256 27620 leader_election.cc:290] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [CANDIDATE]: Term 4 election: Requested vote from peers fdcc2c1450744cb499d898b871318fa0 (127.24.153.68:41611), 7ab3773c508c494d8e0246ad5c98fbc0 (127.24.153.65:45451)
I20260430 02:02:25.651857 27882 raft_consensus.cc:993] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0: : Instructing follower 5a60f2bd808d4e6da03625fcdb9225e8 to start an election
I20260430 02:02:25.651957 27882 raft_consensus.cc:1081] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 3 LEADER]: Signalling peer 5a60f2bd808d4e6da03625fcdb9225e8 to start an election
I20260430 02:02:25.653983 27620 tablet_service.cc:2044] Received Run Leader Election RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6"
dest_uuid: "5a60f2bd808d4e6da03625fcdb9225e8"
from {username='slave'} at 127.24.153.68:50369
W20260430 02:02:25.654597 27286 consensus_peers.cc:409] unable to start election on peer 5a60f2bd808d4e6da03625fcdb9225e8: Illegal state: leader elections are disabled
I20260430 02:02:25.656818 27620 raft_consensus.cc:1240] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 4 FOLLOWER]: Rejecting Update request from peer fdcc2c1450744cb499d898b871318fa0 for earlier term 3. Current term is 4. Ops: [3.1181-3.1181]
I20260430 02:02:25.657395 27731 consensus_queue.cc:1059] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [LEADER]: Peer responded invalid term: Peer: permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 }, Status: INVALID_TERM, Last received: 3.1180, Next index: 1181, Last known committed idx: 1180, Time since last communication: 0.000s
I20260430 02:02:25.657604 27731 raft_consensus.cc:3055] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 3 LEADER]: Stepping down as leader of term 3
I20260430 02:02:25.657653 27731 raft_consensus.cc:740] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 3 LEADER]: Becoming Follower/Learner. State: Replica: fdcc2c1450744cb499d898b871318fa0, State: Running, Role: LEADER
I20260430 02:02:25.657810 27731 consensus_queue.cc:260] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 1180, Committed index: 1180, Last appended: 3.1181, Last appended by leader: 1181, Current term: 3, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:25.658125 27731 raft_consensus.cc:3060] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 3 FOLLOWER]: Advancing to term 4
W20260430 02:02:25.658706 27309 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48260: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:25.659258 27310 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48260: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:25.662715 27447 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46732: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:25.664808 27447 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46732: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
I20260430 02:02:25.666410 27351 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" candidate_term: 4 candidate_status { last_received { term: 3 index: 1180 } } ignore_live_leader: true dest_uuid: "fdcc2c1450744cb499d898b871318fa0"
I20260430 02:02:25.666594 27351 raft_consensus.cc:2410] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 4 FOLLOWER]: Leader election vote request: Denying vote to candidate 5a60f2bd808d4e6da03625fcdb9225e8 for term 4 because replica has last-logged OpId of term: 3 index: 1181, which is greater than that of the candidate, which has last-logged OpId of term: 3 index: 1180.
I20260430 02:02:25.666249 27487 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" candidate_term: 4 candidate_status { last_received { term: 3 index: 1180 } } ignore_live_leader: true dest_uuid: "7ab3773c508c494d8e0246ad5c98fbc0"
I20260430 02:02:25.666922 27487 raft_consensus.cc:3060] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 3 FOLLOWER]: Advancing to term 4
I20260430 02:02:25.668267 27487 raft_consensus.cc:2410] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 4 FOLLOWER]: Leader election vote request: Denying vote to candidate 5a60f2bd808d4e6da03625fcdb9225e8 for term 4 because replica has last-logged OpId of term: 3 index: 1181, which is greater than that of the candidate, which has last-logged OpId of term: 3 index: 1180.
W20260430 02:02:25.671891 27566 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45240: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:25.672354 27567 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45240: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
I20260430 02:02:25.673084 27555 leader_election.cc:304] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [CANDIDATE]: Term 4 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 5a60f2bd808d4e6da03625fcdb9225e8; no voters: 7ab3773c508c494d8e0246ad5c98fbc0, fdcc2c1450744cb499d898b871318fa0
I20260430 02:02:25.673578 27927 raft_consensus.cc:2749] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 4 FOLLOWER]: Leader election lost for term 4. Reason: could not achieve majority
W20260430 02:02:25.680131 27310 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48260: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:25.680467 27309 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48260: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:25.685675 27447 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46732: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:25.687862 27447 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46732: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:25.695154 27568 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45240: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:25.696460 27568 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45240: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:25.704053 27309 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48260: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:25.704097 27310 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48260: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:25.712749 27447 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46732: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:25.713970 27447 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46732: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:25.723932 27571 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45240: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:25.725512 27568 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45240: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:25.736934 27310 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48260: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:25.739156 27310 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48260: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:25.749588 27447 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46732: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:25.751874 27447 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46732: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:25.763468 27568 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45240: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:25.764449 27566 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45240: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:25.781080 27310 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48260: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:25.782493 27310 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48260: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:25.795861 27447 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46732: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:25.796834 27447 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46732: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:25.814523 27568 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45240: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:25.815351 27568 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45240: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:25.834403 27310 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48260: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:25.836761 27310 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48260: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:25.853092 27447 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46732: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:25.854136 27447 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46732: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:25.874614 27571 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45240: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:25.875782 27568 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45240: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:25.886801 27742 raft_consensus.cc:670] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0: failed to trigger leader election: Illegal state: leader elections are disabled
W20260430 02:02:25.897408 27310 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48260: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:25.899396 27310 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48260: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:25.918104 27447 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46732: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:25.923621 27447 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46732: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:25.941202 27566 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45240: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:25.946434 27571 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45240: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:25.959614 27928 raft_consensus.cc:670] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0: failed to trigger leader election: Illegal state: leader elections are disabled
W20260430 02:02:25.969278 27310 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48260: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:25.971668 27310 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48260: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:25.996299 27446 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46732: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:25.996301 27447 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46732: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.013593 27927 raft_consensus.cc:670] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8: failed to trigger leader election: Illegal state: leader elections are disabled
W20260430 02:02:26.023190 27568 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45240: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.025223 27568 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45240: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.051331 27310 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48260: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.052050 27310 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48260: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.079823 27446 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46732: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.082198 27446 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46732: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.110453 27568 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45240: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.114447 27568 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45240: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.141465 27310 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48260: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.145539 27310 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48260: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.175959 27447 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46732: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.175959 27446 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46732: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.206867 27568 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45240: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.207671 27568 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45240: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.242250 27310 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48260: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.242362 27309 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48260: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.275053 27446 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46732: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.279647 27446 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46732: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.312628 27568 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45240: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.316789 27568 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45240: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.352028 27309 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48260: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.355952 27309 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48260: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.388631 27446 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46732: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.392066 27446 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46732: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.429096 27568 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45240: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.431895 27568 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45240: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.470134 27309 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48260: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.474457 27309 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48260: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.509415 27446 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46732: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.514127 27446 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46732: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.550875 27568 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45240: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.558068 27568 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45240: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.593896 27309 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48260: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.603250 27309 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48260: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.637073 27446 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46732: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.649473 27446 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46732: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.683722 27568 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45240: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.693073 27568 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45240: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.734129 27309 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48260: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.742838 27309 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48260: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.780798 27446 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46732: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.787617 27446 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46732: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.831286 27568 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45240: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
I20260430 02:02:26.832418 27331 tablet_service.cc:1460] Tablet server fdcc2c1450744cb499d898b871318fa0 set to quiescing
I20260430 02:02:26.832599 27331 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20260430 02:02:26.833418 27568 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45240: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
I20260430 02:02:26.850337 27602 tablet_service.cc:1460] Tablet server 5a60f2bd808d4e6da03625fcdb9225e8 set to quiescing
I20260430 02:02:26.850535 27602 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20260430 02:02:26.880703 27309 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48260: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.883376 27309 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48260: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
I20260430 02:02:26.909807 25189 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu with pid 27135
I20260430 02:02:26.917060 25189 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
/tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.24.153.66:44867
--local_ip_for_outbound_sockets=127.24.153.66
--tserver_master_addrs=127.24.153.126:41269
--webserver_port=32993
--webserver_interface=127.24.153.66
--builtin_ntp_servers=127.24.153.84:44061
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20260430 02:02:26.932204 27446 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46732: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.933264 27446 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46732: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.983182 27568 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45240: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:26.985150 27568 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45240: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:27.033387 27951 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260430 02:02:27.033660 27951 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260430 02:02:27.033746 27951 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260430 02:02:27.037880 27309 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48260: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:27.038056 27310 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48260: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:27.038223 27951 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260430 02:02:27.038378 27951 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.24.153.66
I20260430 02:02:27.042709 27951 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.24.153.84:44061
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.24.153.66:44867
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.24.153.66
--webserver_port=32993
--tserver_master_addrs=127.24.153.126:41269
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.27951
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.24.153.66
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8537ed5bb5e0e6ac1e481ff961e6b9d08cce7671
build type DEBUG
built by None at 30 Apr 2026 01:43:13 UTC on bdcb31816ec0
build id 11668
I20260430 02:02:27.043902 27951 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260430 02:02:27.045122 27951 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260430 02:02:27.048363 27951 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260430 02:02:27.053552 27957 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:02:27.053580 27956 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:02:27.053660 27959 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260430 02:02:27.053859 27951 server_base.cc:1061] running on GCE node
I20260430 02:02:27.054405 27951 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260430 02:02:27.055025 27951 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260430 02:02:27.056210 27951 hybrid_clock.cc:648] HybridClock initialized: now 1777514547056189 us; error 46 us; skew 500 ppm
I20260430 02:02:27.058470 27951 webserver.cc:492] Webserver started at http://127.24.153.66:32993/ using document root <none> and password file <none>
I20260430 02:02:27.059119 27951 fs_manager.cc:362] Metadata directory not provided
I20260430 02:02:27.059214 27951 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260430 02:02:27.063124 27951 fs_manager.cc:714] Time spent opening directory manager: real 0.002s user 0.001s sys 0.003s
I20260430 02:02:27.065410 27965 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260430 02:02:27.066656 27951 fs_manager.cc:730] Time spent opening block manager: real 0.002s user 0.002s sys 0.001s
I20260430 02:02:27.066780 27951 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/data,/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/wal
uuid: "a2771da784d84201a3de0860ab987f1f"
format_stamp: "Formatted at 2026-04-30 02:02:14 on dist-test-slave-f7mg"
I20260430 02:02:27.067212 27951 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
W20260430 02:02:27.089061 27446 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46732: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:27.089902 27446 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46732: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
I20260430 02:02:27.093144 27951 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260430 02:02:27.093858 27951 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260430 02:02:27.094116 27951 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260430 02:02:27.094714 27951 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260430 02:02:27.095739 27951 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260430 02:02:27.095788 27951 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20260430 02:02:27.095862 27951 ts_tablet_manager.cc:616] Registered 0 tablets
I20260430 02:02:27.095893 27951 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20260430 02:02:27.106007 27951 rpc_server.cc:307] RPC server started. Bound to: 127.24.153.66:44867
I20260430 02:02:27.106026 28078 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.24.153.66:44867 every 8 connection(s)
I20260430 02:02:27.107152 27951 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/data/info.pb
I20260430 02:02:27.114791 25189 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu as pid 27951
I20260430 02:02:27.114952 25189 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu with pid 27267
I20260430 02:02:27.117898 28079 heartbeater.cc:344] Connected to a master server at 127.24.153.126:41269
I20260430 02:02:27.118222 28079 heartbeater.cc:461] Registering TS with master...
I20260430 02:02:27.118806 28079 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
I20260430 02:02:27.119947 26371 ts_manager.cc:194] Re-registered known tserver with Master: a2771da784d84201a3de0860ab987f1f (127.24.153.66:44867)
I20260430 02:02:27.120541 26371 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.24.153.66:44013
W20260430 02:02:27.122046 26966 meta_cache.cc:302] tablet 5f11b12f19254ea9a609af28f72cdaa6: replica fdcc2c1450744cb499d898b871318fa0 (127.24.153.68:41611) has failed: Network error: recv got EOF from 127.24.153.68:41611 (error 108)
I20260430 02:02:27.122555 25189 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
/tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.24.153.68:41611
--local_ip_for_outbound_sockets=127.24.153.68
--tserver_master_addrs=127.24.153.126:41269
--webserver_port=34395
--webserver_interface=127.24.153.68
--builtin_ntp_servers=127.24.153.84:44061
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20260430 02:02:27.142565 27568 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45240: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:27.143255 27568 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45240: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:27.234851 28083 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260430 02:02:27.235107 28083 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260430 02:02:27.235162 28083 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260430 02:02:27.239041 28083 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260430 02:02:27.239173 28083 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.24.153.68
I20260430 02:02:27.243305 28083 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.24.153.84:44061
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.24.153.68:41611
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.24.153.68
--webserver_port=34395
--tserver_master_addrs=127.24.153.126:41269
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.28083
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.24.153.68
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8537ed5bb5e0e6ac1e481ff961e6b9d08cce7671
build type DEBUG
built by None at 30 Apr 2026 01:43:13 UTC on bdcb31816ec0
build id 11668
I20260430 02:02:27.244407 28083 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260430 02:02:27.245509 28083 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260430 02:02:27.248399 28083 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260430 02:02:27.251675 27446 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46732: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:27.252568 28092 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:02:27.252568 28090 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:02:27.252568 28089 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260430 02:02:27.253561 28083 server_base.cc:1061] running on GCE node
I20260430 02:02:27.254104 28083 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260430 02:02:27.254807 28083 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
W20260430 02:02:27.255373 27446 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46732: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
I20260430 02:02:27.256067 28083 hybrid_clock.cc:648] HybridClock initialized: now 1777514547256050 us; error 34 us; skew 500 ppm
I20260430 02:02:27.258237 28083 webserver.cc:492] Webserver started at http://127.24.153.68:34395/ using document root <none> and password file <none>
I20260430 02:02:27.258891 28083 fs_manager.cc:362] Metadata directory not provided
I20260430 02:02:27.258951 28083 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260430 02:02:27.262861 28083 fs_manager.cc:714] Time spent opening directory manager: real 0.003s user 0.004s sys 0.000s
I20260430 02:02:27.265403 28098 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260430 02:02:27.266700 28083 fs_manager.cc:730] Time spent opening block manager: real 0.003s user 0.002s sys 0.000s
I20260430 02:02:27.266861 28083 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/data,/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/wal
uuid: "fdcc2c1450744cb499d898b871318fa0"
format_stamp: "Formatted at 2026-04-30 02:02:14 on dist-test-slave-f7mg"
I20260430 02:02:27.267320 28083 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260430 02:02:27.285121 28083 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260430 02:02:27.285874 28083 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260430 02:02:27.286130 28083 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260430 02:02:27.286754 28083 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260430 02:02:27.287946 28105 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20260430 02:02:27.291290 28083 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20260430 02:02:27.291370 28083 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.004s user 0.000s sys 0.000s
I20260430 02:02:27.291554 28083 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20260430 02:02:27.293362 28083 ts_tablet_manager.cc:616] Registered 1 tablets
I20260430 02:02:27.293423 28083 ts_tablet_manager.cc:595] Time spent register tablets: real 0.002s user 0.000s sys 0.000s
I20260430 02:02:27.293861 28105 tablet_bootstrap.cc:492] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0: Bootstrap starting.
I20260430 02:02:27.305207 28083 rpc_server.cc:307] RPC server started. Bound to: 127.24.153.68:41611
I20260430 02:02:27.305298 28212 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.24.153.68:41611 every 8 connection(s)
I20260430 02:02:27.306344 28083 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/data/info.pb
I20260430 02:02:27.309451 25189 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu as pid 28083
I20260430 02:02:27.309587 25189 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu with pid 27401
W20260430 02:02:27.310168 27568 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45240: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:27.313783 27568 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45240: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:27.318879 26966 connection.cc:570] client connection to 127.24.153.65:45451 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
I20260430 02:02:27.319087 26994 meta_cache.cc:1510] marking tablet server 7ab3773c508c494d8e0246ad5c98fbc0 (127.24.153.65:45451) as failed
I20260430 02:02:27.319299 25189 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
/tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.24.153.65:45451
--local_ip_for_outbound_sockets=127.24.153.65
--tserver_master_addrs=127.24.153.126:41269
--webserver_port=41493
--webserver_interface=127.24.153.65
--builtin_ntp_servers=127.24.153.84:44061
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20260430 02:02:27.320083 28213 heartbeater.cc:344] Connected to a master server at 127.24.153.126:41269
I20260430 02:02:27.320363 28213 heartbeater.cc:461] Registering TS with master...
I20260430 02:02:27.321041 28213 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
W20260430 02:02:27.323346 27568 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45240: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
I20260430 02:02:27.323968 26371 ts_manager.cc:194] Re-registered known tserver with Master: fdcc2c1450744cb499d898b871318fa0 (127.24.153.68:41611)
I20260430 02:02:27.324947 26371 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.24.153.68:51141
W20260430 02:02:27.359148 27568 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45240: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
I20260430 02:02:27.373263 26996 meta_cache.cc:1510] marking tablet server 7ab3773c508c494d8e0246ad5c98fbc0 (127.24.153.65:45451) as failed
W20260430 02:02:27.383728 27568 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45240: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:27.420451 27568 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45240: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
I20260430 02:02:27.425357 28105 log.cc:826] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0: Log is configured to *not* fsync() on all Append() calls
W20260430 02:02:27.440558 28216 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260430 02:02:27.440812 28216 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260430 02:02:27.440865 28216 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260430 02:02:27.444677 28216 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260430 02:02:27.444882 28216 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.24.153.65
I20260430 02:02:27.449512 28216 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.24.153.84:44061
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.24.153.65:45451
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.24.153.65
--webserver_port=41493
--tserver_master_addrs=127.24.153.126:41269
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.28216
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.24.153.65
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8537ed5bb5e0e6ac1e481ff961e6b9d08cce7671
build type DEBUG
built by None at 30 Apr 2026 01:43:13 UTC on bdcb31816ec0
build id 11668
I20260430 02:02:27.450877 28216 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260430 02:02:27.452136 28216 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260430 02:02:27.455752 28216 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260430 02:02:27.460197 28223 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:02:27.460309 28224 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:02:27.460232 28226 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260430 02:02:27.460729 28216 server_base.cc:1061] running on GCE node
I20260430 02:02:27.461205 28216 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260430 02:02:27.461879 28216 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260430 02:02:27.463114 28216 hybrid_clock.cc:648] HybridClock initialized: now 1777514547463090 us; error 32 us; skew 500 ppm
I20260430 02:02:27.465178 28216 webserver.cc:492] Webserver started at http://127.24.153.65:41493/ using document root <none> and password file <none>
I20260430 02:02:27.465875 28216 fs_manager.cc:362] Metadata directory not provided
I20260430 02:02:27.466009 28216 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260430 02:02:27.470302 28216 fs_manager.cc:714] Time spent opening directory manager: real 0.003s user 0.000s sys 0.003s
I20260430 02:02:27.472806 28232 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260430 02:02:27.474241 28216 fs_manager.cc:730] Time spent opening block manager: real 0.003s user 0.002s sys 0.001s
I20260430 02:02:27.474398 28216 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/data,/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/wal
uuid: "7ab3773c508c494d8e0246ad5c98fbc0"
format_stamp: "Formatted at 2026-04-30 02:02:13 on dist-test-slave-f7mg"
I20260430 02:02:27.474826 28216 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260430 02:02:27.488054 28216 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
W20260430 02:02:27.488396 27568 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45240: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
I20260430 02:02:27.488929 28216 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260430 02:02:27.489154 28216 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260430 02:02:27.489820 28216 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
W20260430 02:02:27.490453 27566 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45240: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:27.490535 27568 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45240: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
I20260430 02:02:27.492275 28239 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20260430 02:02:27.497704 28216 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20260430 02:02:27.497840 28216 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.006s user 0.001s sys 0.000s
I20260430 02:02:27.497936 28216 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20260430 02:02:27.499747 28216 ts_tablet_manager.cc:616] Registered 1 tablets
I20260430 02:02:27.499850 28216 ts_tablet_manager.cc:595] Time spent register tablets: real 0.002s user 0.000s sys 0.000s
I20260430 02:02:27.500197 28239 tablet_bootstrap.cc:492] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0: Bootstrap starting.
I20260430 02:02:27.512552 28216 rpc_server.cc:307] RPC server started. Bound to: 127.24.153.65:45451
I20260430 02:02:27.512573 28346 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.24.153.65:45451 every 8 connection(s)
I20260430 02:02:27.513651 28216 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/data/info.pb
I20260430 02:02:27.517073 25189 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu as pid 28216
I20260430 02:02:27.517218 25189 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu with pid 27537
I20260430 02:02:27.528900 28347 heartbeater.cc:344] Connected to a master server at 127.24.153.126:41269
I20260430 02:02:27.529215 28347 heartbeater.cc:461] Registering TS with master...
I20260430 02:02:27.529268 25189 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
/tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.24.153.67:43357
--local_ip_for_outbound_sockets=127.24.153.67
--tserver_master_addrs=127.24.153.126:41269
--webserver_port=43087
--webserver_interface=127.24.153.67
--builtin_ntp_servers=127.24.153.84:44061
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20260430 02:02:27.529891 28347 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
I20260430 02:02:27.532565 26369 ts_manager.cc:194] Re-registered known tserver with Master: 7ab3773c508c494d8e0246ad5c98fbc0 (127.24.153.65:45451)
I20260430 02:02:27.533806 26369 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.24.153.65:36395
W20260430 02:02:27.665091 28351 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260430 02:02:27.665360 28351 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260430 02:02:27.665457 28351 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260430 02:02:27.669425 28351 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260430 02:02:27.669652 28351 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.24.153.67
I20260430 02:02:27.674615 28351 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.24.153.84:44061
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.24.153.67:43357
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.24.153.67
--webserver_port=43087
--tserver_master_addrs=127.24.153.126:41269
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.28351
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.24.153.67
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8537ed5bb5e0e6ac1e481ff961e6b9d08cce7671
build type DEBUG
built by None at 30 Apr 2026 01:43:13 UTC on bdcb31816ec0
build id 11668
I20260430 02:02:27.675351 28239 log.cc:826] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0: Log is configured to *not* fsync() on all Append() calls
I20260430 02:02:27.676041 28351 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260430 02:02:27.677394 28351 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260430 02:02:27.681198 28351 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260430 02:02:27.685738 28360 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:02:27.686098 28357 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:02:27.686666 28358 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260430 02:02:27.686748 28351 server_base.cc:1061] running on GCE node
I20260430 02:02:27.687141 28351 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260430 02:02:27.687785 28351 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260430 02:02:27.689015 28351 hybrid_clock.cc:648] HybridClock initialized: now 1777514547688990 us; error 40 us; skew 500 ppm
I20260430 02:02:27.691546 28351 webserver.cc:492] Webserver started at http://127.24.153.67:43087/ using document root <none> and password file <none>
I20260430 02:02:27.692232 28351 fs_manager.cc:362] Metadata directory not provided
I20260430 02:02:27.692332 28351 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260430 02:02:27.696357 28351 fs_manager.cc:714] Time spent opening directory manager: real 0.003s user 0.004s sys 0.000s
I20260430 02:02:27.698825 28366 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260430 02:02:27.699958 28351 fs_manager.cc:730] Time spent opening block manager: real 0.002s user 0.003s sys 0.000s
I20260430 02:02:27.700119 28351 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/data,/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/wal
uuid: "5a60f2bd808d4e6da03625fcdb9225e8"
format_stamp: "Formatted at 2026-04-30 02:02:14 on dist-test-slave-f7mg"
I20260430 02:02:27.700546 28351 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260430 02:02:27.729244 28351 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260430 02:02:27.730088 28351 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260430 02:02:27.730311 28351 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260430 02:02:27.730983 28351 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260430 02:02:27.732527 28373 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20260430 02:02:27.742477 28351 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20260430 02:02:27.742578 28351 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.010s user 0.001s sys 0.000s
I20260430 02:02:27.742660 28351 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20260430 02:02:27.745751 28351 ts_tablet_manager.cc:616] Registered 1 tablets
I20260430 02:02:27.745837 28351 ts_tablet_manager.cc:595] Time spent register tablets: real 0.003s user 0.001s sys 0.000s
I20260430 02:02:27.755869 28373 tablet_bootstrap.cc:492] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8: Bootstrap starting.
I20260430 02:02:27.759423 28351 rpc_server.cc:307] RPC server started. Bound to: 127.24.153.67:43357
I20260430 02:02:27.760857 28351 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/data/info.pb
I20260430 02:02:27.764393 25189 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu as pid 28351
I20260430 02:02:27.771433 28480 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.24.153.67:43357 every 8 connection(s)
I20260430 02:02:27.816519 28481 heartbeater.cc:344] Connected to a master server at 127.24.153.126:41269
I20260430 02:02:27.816845 28481 heartbeater.cc:461] Registering TS with master...
I20260430 02:02:27.817660 28481 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
I20260430 02:02:27.820047 26369 ts_manager.cc:194] Re-registered known tserver with Master: 5a60f2bd808d4e6da03625fcdb9225e8 (127.24.153.67:43357)
I20260430 02:02:27.821071 26369 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.24.153.67:44123
I20260430 02:02:27.946638 28373 log.cc:826] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8: Log is configured to *not* fsync() on all Append() calls
I20260430 02:02:28.108497 28413 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260430 02:02:28.122730 28079 heartbeater.cc:499] Master 127.24.153.126:41269 was elected leader, sending a full tablet report...
I20260430 02:02:28.127031 28281 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260430 02:02:28.148716 28013 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260430 02:02:28.159065 28147 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260430 02:02:28.327884 28213 heartbeater.cc:499] Master 127.24.153.126:41269 was elected leader, sending a full tablet report...
I20260430 02:02:28.536589 28347 heartbeater.cc:499] Master 127.24.153.126:41269 was elected leader, sending a full tablet report...
I20260430 02:02:28.823726 28481 heartbeater.cc:499] Master 127.24.153.126:41269 was elected leader, sending a full tablet report...
I20260430 02:02:30.066708 28373 tablet_bootstrap.cc:492] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8: Bootstrap replayed 1/1 log segments. Stats: ops{read=1180 overwritten=0 applied=1180 ignored=0} inserts{seen=58900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20260430 02:02:30.067216 28373 tablet_bootstrap.cc:492] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8: Bootstrap complete.
I20260430 02:02:30.068645 28373 ts_tablet_manager.cc:1403] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8: Time spent bootstrapping tablet: real 2.315s user 2.203s sys 0.059s
I20260430 02:02:30.071331 28373 raft_consensus.cc:359] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 4 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:30.071831 28373 raft_consensus.cc:740] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 4 FOLLOWER]: Becoming Follower/Learner. State: Replica: 5a60f2bd808d4e6da03625fcdb9225e8, State: Initialized, Role: FOLLOWER
I20260430 02:02:30.072499 28373 consensus_queue.cc:260] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 1180, Last appended: 3.1180, Last appended by leader: 1180, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:30.073577 28373 ts_tablet_manager.cc:1434] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8: Time spent starting tablet: real 0.005s user 0.007s sys 0.000s
W20260430 02:02:30.087018 28376 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:30.187309 26996 scanner-internal.cc:458] Time spent opening tablet: real 4.025s user 0.005s sys 0.000s
W20260430 02:02:30.261727 28376 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
I20260430 02:02:30.279150 28105 tablet_bootstrap.cc:492] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0: Bootstrap replayed 1/1 log segments. Stats: ops{read=1181 overwritten=0 applied=1180 ignored=0} inserts{seen=58900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
W20260430 02:02:30.279462 28392 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
I20260430 02:02:30.279799 28105 tablet_bootstrap.cc:492] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0: Bootstrap complete.
I20260430 02:02:30.281502 28105 ts_tablet_manager.cc:1403] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0: Time spent bootstrapping tablet: real 2.988s user 2.830s sys 0.076s
W20260430 02:02:30.283636 28376 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
I20260430 02:02:30.285295 28105 raft_consensus.cc:359] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 4 FOLLOWER]: Replica starting. Triggering 1 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:30.287304 28105 raft_consensus.cc:740] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 4 FOLLOWER]: Becoming Follower/Learner. State: Replica: fdcc2c1450744cb499d898b871318fa0, State: Initialized, Role: FOLLOWER
I20260430 02:02:30.287945 28105 consensus_queue.cc:260] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 1180, Last appended: 3.1181, Last appended by leader: 1181, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:30.288936 28105 ts_tablet_manager.cc:1434] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0: Time spent starting tablet: real 0.007s user 0.006s sys 0.000s
I20260430 02:02:30.299114 28519 raft_consensus.cc:493] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 4 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260430 02:02:30.299317 28519 raft_consensus.cc:515] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 4 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:30.300526 28519 leader_election.cc:290] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [CANDIDATE]: Term 5 pre-election: Requested pre-vote from peers fdcc2c1450744cb499d898b871318fa0 (127.24.153.68:41611), 7ab3773c508c494d8e0246ad5c98fbc0 (127.24.153.65:45451)
I20260430 02:02:30.307509 28167 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" candidate_term: 5 candidate_status { last_received { term: 3 index: 1180 } } ignore_live_leader: false dest_uuid: "fdcc2c1450744cb499d898b871318fa0" is_pre_election: true
I20260430 02:02:30.307873 28167 raft_consensus.cc:2410] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 4 FOLLOWER]: Leader pre-election vote request: Denying vote to candidate 5a60f2bd808d4e6da03625fcdb9225e8 for term 5 because replica has last-logged OpId of term: 3 index: 1181, which is greater than that of the candidate, which has last-logged OpId of term: 3 index: 1180.
I20260430 02:02:30.318203 28301 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" candidate_term: 5 candidate_status { last_received { term: 3 index: 1180 } } ignore_live_leader: false dest_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" is_pre_election: true
W20260430 02:02:30.319955 28368 leader_election.cc:343] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [CANDIDATE]: Term 5 pre-election: Tablet error from VoteRequest() call to peer 7ab3773c508c494d8e0246ad5c98fbc0 (127.24.153.65:45451): Illegal state: must be running to vote when last-logged opid is not known
I20260430 02:02:30.320312 28368 leader_election.cc:304] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [CANDIDATE]: Term 5 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 5a60f2bd808d4e6da03625fcdb9225e8; no voters: 7ab3773c508c494d8e0246ad5c98fbc0, fdcc2c1450744cb499d898b871318fa0
I20260430 02:02:30.320640 28519 raft_consensus.cc:2749] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 4 FOLLOWER]: Leader pre-election lost for term 5. Reason: could not achieve majority
W20260430 02:02:30.379006 28126 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54616: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:30.475193 28126 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54616: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
I20260430 02:02:30.506515 28239 tablet_bootstrap.cc:492] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0: Bootstrap replayed 1/1 log segments. Stats: ops{read=1181 overwritten=0 applied=1180 ignored=0} inserts{seen=58900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20260430 02:02:30.507205 28239 tablet_bootstrap.cc:492] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0: Bootstrap complete.
I20260430 02:02:30.508685 28239 ts_tablet_manager.cc:1403] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0: Time spent bootstrapping tablet: real 3.009s user 2.891s sys 0.068s
I20260430 02:02:30.511085 28239 raft_consensus.cc:359] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 4 FOLLOWER]: Replica starting. Triggering 1 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:30.512688 28239 raft_consensus.cc:740] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 4 FOLLOWER]: Becoming Follower/Learner. State: Replica: 7ab3773c508c494d8e0246ad5c98fbc0, State: Initialized, Role: FOLLOWER
I20260430 02:02:30.513311 28239 consensus_queue.cc:260] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 1180, Last appended: 3.1181, Last appended by leader: 1181, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:30.514220 28239 ts_tablet_manager.cc:1434] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0: Time spent starting tablet: real 0.005s user 0.008s sys 0.000s
I20260430 02:02:30.552114 28522 raft_consensus.cc:493] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 4 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260430 02:02:30.552341 28522 raft_consensus.cc:515] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 4 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:30.553680 28522 leader_election.cc:290] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [CANDIDATE]: Term 5 pre-election: Requested pre-vote from peers 7ab3773c508c494d8e0246ad5c98fbc0 (127.24.153.65:45451), 5a60f2bd808d4e6da03625fcdb9225e8 (127.24.153.67:43357)
I20260430 02:02:30.558566 28301 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "fdcc2c1450744cb499d898b871318fa0" candidate_term: 5 candidate_status { last_received { term: 3 index: 1181 } } ignore_live_leader: false dest_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" is_pre_election: true
I20260430 02:02:30.558964 28301 raft_consensus.cc:2468] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 4 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate fdcc2c1450744cb499d898b871318fa0 in term 4.
I20260430 02:02:30.559016 28416 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "fdcc2c1450744cb499d898b871318fa0" candidate_term: 5 candidate_status { last_received { term: 3 index: 1181 } } ignore_live_leader: false dest_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" is_pre_election: true
I20260430 02:02:30.559243 28416 raft_consensus.cc:2468] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 4 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate fdcc2c1450744cb499d898b871318fa0 in term 4.
I20260430 02:02:30.559500 28100 leader_election.cc:304] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [CANDIDATE]: Term 5 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 7ab3773c508c494d8e0246ad5c98fbc0, fdcc2c1450744cb499d898b871318fa0; no voters:
I20260430 02:02:30.559788 28522 raft_consensus.cc:2804] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 4 FOLLOWER]: Leader pre-election won for term 5
I20260430 02:02:30.559885 28522 raft_consensus.cc:493] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 4 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260430 02:02:30.559927 28522 raft_consensus.cc:3060] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 4 FOLLOWER]: Advancing to term 5
I20260430 02:02:30.562650 28522 raft_consensus.cc:515] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 5 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:30.563254 28522 leader_election.cc:290] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [CANDIDATE]: Term 5 election: Requested vote from peers 7ab3773c508c494d8e0246ad5c98fbc0 (127.24.153.65:45451), 5a60f2bd808d4e6da03625fcdb9225e8 (127.24.153.67:43357)
I20260430 02:02:30.563691 28416 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "fdcc2c1450744cb499d898b871318fa0" candidate_term: 5 candidate_status { last_received { term: 3 index: 1181 } } ignore_live_leader: false dest_uuid: "5a60f2bd808d4e6da03625fcdb9225e8"
I20260430 02:02:30.563868 28416 raft_consensus.cc:3060] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 4 FOLLOWER]: Advancing to term 5
I20260430 02:02:30.563864 28301 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "fdcc2c1450744cb499d898b871318fa0" candidate_term: 5 candidate_status { last_received { term: 3 index: 1181 } } ignore_live_leader: false dest_uuid: "7ab3773c508c494d8e0246ad5c98fbc0"
I20260430 02:02:30.564013 28301 raft_consensus.cc:3060] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 4 FOLLOWER]: Advancing to term 5
I20260430 02:02:30.566355 28416 raft_consensus.cc:2468] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 5 FOLLOWER]: Leader election vote request: Granting yes vote for candidate fdcc2c1450744cb499d898b871318fa0 in term 5.
I20260430 02:02:30.566481 28301 raft_consensus.cc:2468] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 5 FOLLOWER]: Leader election vote request: Granting yes vote for candidate fdcc2c1450744cb499d898b871318fa0 in term 5.
I20260430 02:02:30.566807 28102 leader_election.cc:304] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [CANDIDATE]: Term 5 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 5a60f2bd808d4e6da03625fcdb9225e8, fdcc2c1450744cb499d898b871318fa0; no voters:
I20260430 02:02:30.567034 28522 raft_consensus.cc:2804] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 5 FOLLOWER]: Leader election won for term 5
I20260430 02:02:30.567366 28522 raft_consensus.cc:697] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 5 LEADER]: Becoming Leader. State: Replica: fdcc2c1450744cb499d898b871318fa0, State: Running, Role: LEADER
I20260430 02:02:30.567688 28522 consensus_queue.cc:237] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 1180, Committed index: 1180, Last appended: 3.1181, Last appended by leader: 1181, Current term: 5, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:30.571275 26369 catalog_manager.cc:5671] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 reported cstate change: term changed from 3 to 5. New cstate: current_term: 5 leader_uuid: "fdcc2c1450744cb499d898b871318fa0" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } health_report { overall_health: UNKNOWN } } }
W20260430 02:02:30.574011 28389 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:30.575506 28389 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
I20260430 02:02:30.648429 28301 raft_consensus.cc:1275] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 5 FOLLOWER]: Refusing update from remote peer fdcc2c1450744cb499d898b871318fa0: Log matching property violated. Preceding OpId in replica: term: 3 index: 1181. Preceding OpId from leader: term: 5 index: 1182. (index mismatch)
I20260430 02:02:30.649124 28522 consensus_queue.cc:1048] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [LEADER]: Connected to new peer: Peer: permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1182, Last known committed idx: 1180, Time since last communication: 0.000s
I20260430 02:02:30.649578 28416 raft_consensus.cc:1275] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 5 FOLLOWER]: Refusing update from remote peer fdcc2c1450744cb499d898b871318fa0: Log matching property violated. Preceding OpId in replica: term: 3 index: 1180. Preceding OpId from leader: term: 5 index: 1182. (index mismatch)
I20260430 02:02:30.651427 28522 consensus_queue.cc:1048] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [LEADER]: Connected to new peer: Peer: permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1182, Last known committed idx: 1180, Time since last communication: 0.000s
W20260430 02:02:30.667788 28389 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:30.676671 28261 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57056: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:30.683768 28261 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57056: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:31.830947 26994 scanner-internal.cc:458] Time spent opening tablet: real 5.726s user 0.004s sys 0.001s
W20260430 02:02:31.843223 26995 scanner-internal.cc:458] Time spent opening tablet: real 6.022s user 0.005s sys 0.000s
I20260430 02:02:33.717366 28281 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260430 02:02:33.738090 28147 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
I20260430 02:02:33.743588 28013 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260430 02:02:33.766829 28413 tablet_service.cc:1467] Tablet server has 0 leaders and 3 scanners
I20260430 02:02:34.387362 26371 ts_manager.cc:284] Unset tserver state for 5a60f2bd808d4e6da03625fcdb9225e8 from MAINTENANCE_MODE
I20260430 02:02:34.660926 26371 ts_manager.cc:284] Unset tserver state for 7ab3773c508c494d8e0246ad5c98fbc0 from MAINTENANCE_MODE
I20260430 02:02:34.669484 28347 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
I20260430 02:02:34.673056 28213 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
I20260430 02:02:34.675282 28481 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
I20260430 02:02:34.684346 26369 ts_manager.cc:284] Unset tserver state for fdcc2c1450744cb499d898b871318fa0 from MAINTENANCE_MODE
I20260430 02:02:34.706081 26369 ts_manager.cc:284] Unset tserver state for a2771da784d84201a3de0860ab987f1f from MAINTENANCE_MODE
I20260430 02:02:35.136662 28079 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
I20260430 02:02:35.245365 26371 ts_manager.cc:295] Set tserver state for fdcc2c1450744cb499d898b871318fa0 to MAINTENANCE_MODE
I20260430 02:02:35.258888 26371 ts_manager.cc:295] Set tserver state for 7ab3773c508c494d8e0246ad5c98fbc0 to MAINTENANCE_MODE
I20260430 02:02:35.261490 26369 ts_manager.cc:295] Set tserver state for a2771da784d84201a3de0860ab987f1f to MAINTENANCE_MODE
I20260430 02:02:35.342037 26369 ts_manager.cc:295] Set tserver state for 5a60f2bd808d4e6da03625fcdb9225e8 to MAINTENANCE_MODE
I20260430 02:02:35.672987 28347 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
I20260430 02:02:35.676687 28213 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
I20260430 02:02:35.680042 28481 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
I20260430 02:02:35.863405 28147 tablet_service.cc:1460] Tablet server fdcc2c1450744cb499d898b871318fa0 set to quiescing
I20260430 02:02:35.863593 28147 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
I20260430 02:02:35.864490 28554 raft_consensus.cc:993] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0: : Instructing follower 7ab3773c508c494d8e0246ad5c98fbc0 to start an election
I20260430 02:02:35.864594 28554 raft_consensus.cc:1081] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 5 LEADER]: Signalling peer 7ab3773c508c494d8e0246ad5c98fbc0 to start an election
I20260430 02:02:35.867823 28603 raft_consensus.cc:993] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0: : Instructing follower 7ab3773c508c494d8e0246ad5c98fbc0 to start an election
I20260430 02:02:35.867923 28603 raft_consensus.cc:1081] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 5 LEADER]: Signalling peer 7ab3773c508c494d8e0246ad5c98fbc0 to start an election
I20260430 02:02:35.868971 28300 tablet_service.cc:2044] Received Run Leader Election RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" dest_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" from {username='slave'} at 127.24.153.68:37237
I20260430 02:02:35.869194 28300 raft_consensus.cc:493] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 5 FOLLOWER]: Starting forced leader election (received explicit request)
I20260430 02:02:35.869285 28300 raft_consensus.cc:3060] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 5 FOLLOWER]: Advancing to term 6
I20260430 02:02:35.870507 28300 raft_consensus.cc:515] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 6 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:35.871049 28299 raft_consensus.cc:1240] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 6 FOLLOWER]: Rejecting Update request from peer fdcc2c1450744cb499d898b871318fa0 for earlier term 5. Current term is 6. Ops: [5.1927-5.1927]
I20260430 02:02:35.871773 28300 leader_election.cc:290] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [CANDIDATE]: Term 6 election: Requested vote from peers fdcc2c1450744cb499d898b871318fa0 (127.24.153.68:41611), 5a60f2bd808d4e6da03625fcdb9225e8 (127.24.153.67:43357)
I20260430 02:02:35.872476 28603 consensus_queue.cc:1059] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [LEADER]: Peer responded invalid term: Peer: permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 }, Status: INVALID_TERM, Last received: 5.1926, Next index: 1927, Last known committed idx: 1926, Time since last communication: 0.000s
I20260430 02:02:35.872578 28554 raft_consensus.cc:993] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0: : Instructing follower 5a60f2bd808d4e6da03625fcdb9225e8 to start an election
I20260430 02:02:35.872690 28554 raft_consensus.cc:1081] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 5 LEADER]: Signalling peer 5a60f2bd808d4e6da03625fcdb9225e8 to start an election
I20260430 02:02:35.873032 28603 raft_consensus.cc:3055] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 5 LEADER]: Stepping down as leader of term 5
I20260430 02:02:35.873090 28603 raft_consensus.cc:740] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 5 LEADER]: Becoming Follower/Learner. State: Replica: fdcc2c1450744cb499d898b871318fa0, State: Running, Role: LEADER
I20260430 02:02:35.873119 28416 tablet_service.cc:2044] Received Run Leader Election RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" dest_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" from {username='slave'} at 127.24.153.68:37865
I20260430 02:02:35.873260 28416 raft_consensus.cc:493] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 5 FOLLOWER]: Starting forced leader election (received explicit request)
I20260430 02:02:35.873315 28416 raft_consensus.cc:3060] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 5 FOLLOWER]: Advancing to term 6
I20260430 02:02:35.873255 28603 consensus_queue.cc:260] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 1927, Committed index: 1927, Last appended: 5.1927, Last appended by leader: 1927, Current term: 5, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:35.873394 28603 raft_consensus.cc:3060] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 5 FOLLOWER]: Advancing to term 6
I20260430 02:02:35.874680 28416 raft_consensus.cc:515] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 6 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:35.875083 28300 raft_consensus.cc:1240] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 6 FOLLOWER]: Rejecting Update request from peer fdcc2c1450744cb499d898b871318fa0 for earlier term 5. Current term is 6. Ops: [5.1927-5.1927]
I20260430 02:02:35.875150 28416 leader_election.cc:290] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [CANDIDATE]: Term 6 election: Requested vote from peers fdcc2c1450744cb499d898b871318fa0 (127.24.153.68:41611), 7ab3773c508c494d8e0246ad5c98fbc0 (127.24.153.65:45451)
I20260430 02:02:35.875416 28301 tablet_service.cc:2044] Received Run Leader Election RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" dest_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" from {username='slave'} at 127.24.153.68:37237
I20260430 02:02:35.875545 28301 raft_consensus.cc:493] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 6 FOLLOWER]: Starting forced leader election (received explicit request)
I20260430 02:02:35.875612 28301 raft_consensus.cc:3060] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 6 FOLLOWER]: Advancing to term 7
I20260430 02:02:35.876588 28301 raft_consensus.cc:515] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 7 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:35.877175 28301 leader_election.cc:290] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [CANDIDATE]: Term 7 election: Requested vote from peers fdcc2c1450744cb499d898b871318fa0 (127.24.153.68:41611), 5a60f2bd808d4e6da03625fcdb9225e8 (127.24.153.67:43357)
I20260430 02:02:35.878374 28167 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" candidate_term: 6 candidate_status { last_received { term: 5 index: 1927 } } ignore_live_leader: true dest_uuid: "fdcc2c1450744cb499d898b871318fa0"
I20260430 02:02:35.879298 28167 raft_consensus.cc:2468] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 6 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 5a60f2bd808d4e6da03625fcdb9225e8 in term 6.
I20260430 02:02:35.879868 28301 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" candidate_term: 6 candidate_status { last_received { term: 5 index: 1927 } } ignore_live_leader: true dest_uuid: "7ab3773c508c494d8e0246ad5c98fbc0"
I20260430 02:02:35.880158 28368 leader_election.cc:304] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [CANDIDATE]: Term 6 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 5a60f2bd808d4e6da03625fcdb9225e8, fdcc2c1450744cb499d898b871318fa0; no voters:
W20260430 02:02:35.881320 28125 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54616: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
I20260430 02:02:35.881397 28301 raft_consensus.cc:2368] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 7 FOLLOWER]: Leader election vote request: Denying vote to candidate 5a60f2bd808d4e6da03625fcdb9225e8 for earlier term 6. Current term is 7.
I20260430 02:02:35.881907 28368 leader_election.cc:400] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [CANDIDATE]: Term 6 election: Vote denied by peer 7ab3773c508c494d8e0246ad5c98fbc0 (127.24.153.65:45451) with higher term. Message: Invalid argument: T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 7 FOLLOWER]: Leader election vote request: Denying vote to candidate 5a60f2bd808d4e6da03625fcdb9225e8 for earlier term 6. Current term is 7.
W20260430 02:02:35.883623 28125 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54616: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:35.884730 28261 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57056: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:35.886490 28521 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54616: Illegal state: Cannot assign timestamp to op. Tablet is not in leader mode. Last heard from a leader: 0.013s ago.
I20260430 02:02:35.888411 28167 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" candidate_term: 6 candidate_status { last_received { term: 5 index: 1926 } } ignore_live_leader: true dest_uuid: "fdcc2c1450744cb499d898b871318fa0"
I20260430 02:02:35.888423 28166 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" candidate_term: 7 candidate_status { last_received { term: 5 index: 1926 } } ignore_live_leader: true dest_uuid: "fdcc2c1450744cb499d898b871318fa0"
I20260430 02:02:35.888664 28167 raft_consensus.cc:2393] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 6 FOLLOWER]: Leader election vote request: Denying vote to candidate 7ab3773c508c494d8e0246ad5c98fbc0 in current term 6: Already voted for candidate 5a60f2bd808d4e6da03625fcdb9225e8 in this term.
W20260430 02:02:35.889882 28261 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57056: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:35.891193 28377 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:35.891893 28377 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
I20260430 02:02:35.892522 28416 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" candidate_term: 6 candidate_status { last_received { term: 5 index: 1926 } } ignore_live_leader: true dest_uuid: "5a60f2bd808d4e6da03625fcdb9225e8"
I20260430 02:02:35.892581 28418 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" candidate_term: 7 candidate_status { last_received { term: 5 index: 1926 } } ignore_live_leader: true dest_uuid: "5a60f2bd808d4e6da03625fcdb9225e8"
I20260430 02:02:35.892753 28416 raft_consensus.cc:2393] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 6 FOLLOWER]: Leader election vote request: Denying vote to candidate 7ab3773c508c494d8e0246ad5c98fbc0 in current term 6: Already voted for candidate 5a60f2bd808d4e6da03625fcdb9225e8 in this term.
I20260430 02:02:35.893039 28236 leader_election.cc:304] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [CANDIDATE]: Term 7 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 7ab3773c508c494d8e0246ad5c98fbc0; no voters: 5a60f2bd808d4e6da03625fcdb9225e8, fdcc2c1450744cb499d898b871318fa0
I20260430 02:02:35.893110 28730 raft_consensus.cc:2804] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 6 FOLLOWER]: Leader election won for term 6
I20260430 02:02:35.893327 28730 raft_consensus.cc:697] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 6 LEADER]: Becoming Leader. State: Replica: 5a60f2bd808d4e6da03625fcdb9225e8, State: Running, Role: LEADER
I20260430 02:02:35.893591 28236 leader_election.cc:304] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [CANDIDATE]: Term 6 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 7ab3773c508c494d8e0246ad5c98fbc0; no voters: 5a60f2bd808d4e6da03625fcdb9225e8, fdcc2c1450744cb499d898b871318fa0
I20260430 02:02:35.893774 28730 consensus_queue.cc:237] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 1927, Committed index: 1927, Last appended: 5.1927, Last appended by leader: 1927, Current term: 6, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:35.894253 28732 raft_consensus.cc:2749] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 7 FOLLOWER]: Leader election lost for term 6. Reason: could not achieve majority
W20260430 02:02:35.894420 28377 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Service unavailable: leader is not yet ready
I20260430 02:02:35.895623 28734 raft_consensus.cc:2749] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 7 FOLLOWER]: Leader election lost for term 7. Reason: could not achieve majority
I20260430 02:02:35.896775 26369 catalog_manager.cc:5671] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 reported cstate change: term changed from 5 to 6, leader changed from fdcc2c1450744cb499d898b871318fa0 (127.24.153.68) to 5a60f2bd808d4e6da03625fcdb9225e8 (127.24.153.67). New cstate: current_term: 6 leader_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } health_report { overall_health: HEALTHY } } }
W20260430 02:02:35.900434 28261 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57056: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:35.902201 28125 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54616: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
I20260430 02:02:35.905376 28301 raft_consensus.cc:1240] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 7 FOLLOWER]: Rejecting Update request from peer 5a60f2bd808d4e6da03625fcdb9225e8 for earlier term 6. Current term is 7. Ops: []
I20260430 02:02:35.905733 28166 raft_consensus.cc:1275] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 6 FOLLOWER]: Refusing update from remote peer 5a60f2bd808d4e6da03625fcdb9225e8: Log matching property violated. Preceding OpId in replica: term: 5 index: 1927. Preceding OpId from leader: term: 6 index: 1929. (index mismatch)
I20260430 02:02:35.906206 28730 consensus_queue.cc:1059] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [LEADER]: Peer responded invalid term: Peer: permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 }, Status: INVALID_TERM, Last received: 0.0, Next index: 1928, Last known committed idx: 1926, Time since last communication: 0.000s
I20260430 02:02:35.906464 28730 raft_consensus.cc:3055] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 6 LEADER]: Stepping down as leader of term 6
I20260430 02:02:35.906524 28730 raft_consensus.cc:740] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 6 LEADER]: Becoming Follower/Learner. State: Replica: 5a60f2bd808d4e6da03625fcdb9225e8, State: Running, Role: LEADER
I20260430 02:02:35.906682 28730 consensus_queue.cc:260] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 1927, Committed index: 1927, Last appended: 6.1929, Last appended by leader: 1929, Current term: 6, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:35.906955 28730 raft_consensus.cc:3060] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 6 FOLLOWER]: Advancing to term 7
W20260430 02:02:35.909235 28261 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57056: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:35.910038 28125 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54616: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:35.918666 28389 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:35.919123 28392 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:35.926189 28261 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57056: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:35.928721 28376 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
I20260430 02:02:35.934705 28413 tablet_service.cc:1460] Tablet server 5a60f2bd808d4e6da03625fcdb9225e8 set to quiescing
I20260430 02:02:35.934875 28413 tablet_service.cc:1467] Tablet server has 0 leaders and 3 scanners
W20260430 02:02:35.937822 28376 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:35.938616 28125 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54616: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:35.947464 28125 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54616: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:35.951819 28261 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57056: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:35.957132 28261 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57056: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:35.968322 28392 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:35.971832 28376 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:35.984423 28125 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54616: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:35.984472 28127 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54616: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:35.998871 28261 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57056: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
I20260430 02:02:35.999675 28281 tablet_service.cc:1460] Tablet server 7ab3773c508c494d8e0246ad5c98fbc0 set to quiescing
I20260430 02:02:35.999840 28281 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20260430 02:02:35.999912 28261 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57056: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
I20260430 02:02:36.005712 28013 tablet_service.cc:1460] Tablet server a2771da784d84201a3de0860ab987f1f set to quiescing
I20260430 02:02:36.005895 28013 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20260430 02:02:36.015618 28392 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.016007 28389 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.031703 28127 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54616: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.031703 28125 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54616: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.051568 28261 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57056: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.052245 28260 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57056: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.072696 28378 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.073212 28377 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.093789 28125 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54616: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.093789 28127 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54616: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.112648 28260 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57056: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.113664 28260 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57056: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.134562 28377 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.138384 28376 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.155977 28127 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54616: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.162257 28127 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54616: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.168030 28730 raft_consensus.cc:670] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8: failed to trigger leader election: Illegal state: leader elections are disabled
W20260430 02:02:36.179668 28260 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57056: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.180970 28724 raft_consensus.cc:670] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0: failed to trigger leader election: Illegal state: leader elections are disabled
W20260430 02:02:36.184085 28260 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57056: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.205713 28376 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.210256 28377 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.231782 28127 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54616: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.237589 28127 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54616: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.258170 28260 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57056: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.265658 28260 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57056: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.286254 28389 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.293113 28377 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.314845 28127 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54616: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.323146 28127 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54616: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.343667 28260 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57056: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.353173 28260 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57056: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.373854 28389 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.387332 28376 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.407644 28127 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54616: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.420909 28127 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54616: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.438709 28260 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57056: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.452383 28260 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57056: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.475406 28376 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.490161 28389 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.511020 28127 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54616: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.524507 28127 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54616: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.549043 28260 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57056: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.558632 28260 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57056: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.559355 28758 raft_consensus.cc:670] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0: failed to trigger leader election: Illegal state: leader elections are disabled
W20260430 02:02:36.586580 28376 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.599288 28376 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.622732 28127 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54616: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.639159 28127 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54616: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.662915 28260 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57056: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.678282 28260 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57056: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.707387 28389 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.718119 28376 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.747972 28127 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54616: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.761581 28127 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54616: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.789683 28260 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57056: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.806510 28260 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57056: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.836869 28376 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.852675 28376 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.879084 28127 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54616: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.895725 28127 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54616: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.925891 28260 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57056: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.942129 28260 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57056: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.975605 28376 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:36.993031 28376 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:37.026336 28127 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54616: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:37.039867 28127 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54616: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:37.073585 28260 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57056: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
I20260430 02:02:37.078241 28147 tablet_service.cc:1460] Tablet server fdcc2c1450744cb499d898b871318fa0 set to quiescing
I20260430 02:02:37.078452 28147 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20260430 02:02:37.087857 28260 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57056: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:37.125622 28376 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:37.140357 28376 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
I20260430 02:02:37.154927 28413 tablet_service.cc:1460] Tablet server 5a60f2bd808d4e6da03625fcdb9225e8 set to quiescing
I20260430 02:02:37.155098 28413 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20260430 02:02:37.175827 28127 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54616: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:37.191244 28127 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54616: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
I20260430 02:02:37.214269 25189 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu with pid 27951
I20260430 02:02:37.221256 25189 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
/tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.24.153.66:44867
--local_ip_for_outbound_sockets=127.24.153.66
--tserver_master_addrs=127.24.153.126:41269
--webserver_port=32993
--webserver_interface=127.24.153.66
--builtin_ntp_servers=127.24.153.84:44061
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20260430 02:02:37.226776 28260 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57056: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:37.240839 28260 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57056: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:37.282855 28376 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:37.292784 28376 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:37.335558 28127 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54616: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:37.344199 28781 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260430 02:02:37.344515 28781 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260430 02:02:37.344588 28781 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260430 02:02:37.344915 28127 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54616: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:37.348574 28781 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260430 02:02:37.348706 28781 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.24.153.66
I20260430 02:02:37.353112 28781 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.24.153.84:44061
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.24.153.66:44867
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.24.153.66
--webserver_port=32993
--tserver_master_addrs=127.24.153.126:41269
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.28781
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.24.153.66
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8537ed5bb5e0e6ac1e481ff961e6b9d08cce7671
build type DEBUG
built by None at 30 Apr 2026 01:43:13 UTC on bdcb31816ec0
build id 11668
I20260430 02:02:37.354256 28781 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260430 02:02:37.355329 28781 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260430 02:02:37.358122 28781 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260430 02:02:37.362525 28786 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:02:37.362532 28787 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:02:37.362532 28789 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260430 02:02:37.363480 28781 server_base.cc:1061] running on GCE node
I20260430 02:02:37.363935 28781 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260430 02:02:37.364516 28781 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260430 02:02:37.365687 28781 hybrid_clock.cc:648] HybridClock initialized: now 1777514557365676 us; error 32 us; skew 500 ppm
I20260430 02:02:37.367969 28781 webserver.cc:492] Webserver started at http://127.24.153.66:32993/ using document root <none> and password file <none>
I20260430 02:02:37.368602 28781 fs_manager.cc:362] Metadata directory not provided
I20260430 02:02:37.368665 28781 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260430 02:02:37.372505 28781 fs_manager.cc:714] Time spent opening directory manager: real 0.002s user 0.001s sys 0.000s
I20260430 02:02:37.376130 28795 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260430 02:02:37.377324 28781 fs_manager.cc:730] Time spent opening block manager: real 0.003s user 0.001s sys 0.000s
I20260430 02:02:37.377491 28781 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/data,/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/wal
uuid: "a2771da784d84201a3de0860ab987f1f"
format_stamp: "Formatted at 2026-04-30 02:02:14 on dist-test-slave-f7mg"
I20260430 02:02:37.378058 28781 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
W20260430 02:02:37.391918 28260 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57056: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
I20260430 02:02:37.397735 28781 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260430 02:02:37.399106 28781 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260430 02:02:37.399370 28781 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260430 02:02:37.400362 28781 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
W20260430 02:02:37.401139 28260 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57056: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
I20260430 02:02:37.401868 28781 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260430 02:02:37.401983 28781 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.001s
I20260430 02:02:37.402047 28781 ts_tablet_manager.cc:616] Registered 0 tablets
I20260430 02:02:37.402065 28781 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20260430 02:02:37.415319 28781 rpc_server.cc:307] RPC server started. Bound to: 127.24.153.66:44867
I20260430 02:02:37.415349 28908 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.24.153.66:44867 every 8 connection(s)
I20260430 02:02:37.416553 28781 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/data/info.pb
I20260430 02:02:37.419428 25189 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu as pid 28781
I20260430 02:02:37.419589 25189 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu with pid 28083
I20260430 02:02:37.429529 28909 heartbeater.cc:344] Connected to a master server at 127.24.153.126:41269
I20260430 02:02:37.429709 25189 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
/tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.24.153.68:41611
--local_ip_for_outbound_sockets=127.24.153.68
--tserver_master_addrs=127.24.153.126:41269
--webserver_port=34395
--webserver_interface=127.24.153.68
--builtin_ntp_servers=127.24.153.84:44061
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20260430 02:02:37.429963 28909 heartbeater.cc:461] Registering TS with master...
I20260430 02:02:37.430541 28909 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
I20260430 02:02:37.432176 26369 ts_manager.cc:194] Re-registered known tserver with Master: a2771da784d84201a3de0860ab987f1f (127.24.153.66:44867)
I20260430 02:02:37.432865 26369 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.24.153.66:58715
W20260430 02:02:37.447005 28376 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:37.456923 28376 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:37.503002 26966 meta_cache.cc:302] tablet 5f11b12f19254ea9a609af28f72cdaa6: replica fdcc2c1450744cb499d898b871318fa0 (127.24.153.68:41611) has failed: Network error: Client connection negotiation failed: client connection to 127.24.153.68:41611: connect: Connection refused (error 111)
W20260430 02:02:37.512518 28260 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57056: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:37.550482 28913 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260430 02:02:37.550743 28913 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260430 02:02:37.550830 28913 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260430 02:02:37.554601 28913 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260430 02:02:37.554790 28913 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.24.153.68
W20260430 02:02:37.558027 28260 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57056: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
I20260430 02:02:37.559767 28913 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.24.153.84:44061
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.24.153.68:41611
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.24.153.68
--webserver_port=34395
--tserver_master_addrs=127.24.153.126:41269
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.28913
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.24.153.68
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8537ed5bb5e0e6ac1e481ff961e6b9d08cce7671
build type DEBUG
built by None at 30 Apr 2026 01:43:13 UTC on bdcb31816ec0
build id 11668
I20260430 02:02:37.561231 28913 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260430 02:02:37.562475 28913 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260430 02:02:37.565374 28913 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260430 02:02:37.570509 28919 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:02:37.570555 28922 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260430 02:02:37.570749 28913 server_base.cc:1061] running on GCE node
W20260430 02:02:37.570947 28920 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260430 02:02:37.571694 28913 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260430 02:02:37.572826 28913 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
W20260430 02:02:37.573315 28376 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
I20260430 02:02:37.574096 28913 hybrid_clock.cc:648] HybridClock initialized: now 1777514557574051 us; error 68 us; skew 500 ppm
I20260430 02:02:37.579833 28913 webserver.cc:492] Webserver started at http://127.24.153.68:34395/ using document root <none> and password file <none>
I20260430 02:02:37.580734 28913 fs_manager.cc:362] Metadata directory not provided
I20260430 02:02:37.580912 28913 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260430 02:02:37.585179 28913 fs_manager.cc:714] Time spent opening directory manager: real 0.003s user 0.002s sys 0.000s
I20260430 02:02:37.587706 28928 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260430 02:02:37.588912 28913 fs_manager.cc:730] Time spent opening block manager: real 0.002s user 0.001s sys 0.000s
I20260430 02:02:37.589077 28913 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/data,/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/wal
uuid: "fdcc2c1450744cb499d898b871318fa0"
format_stamp: "Formatted at 2026-04-30 02:02:14 on dist-test-slave-f7mg"
I20260430 02:02:37.589522 28913 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260430 02:02:37.622407 28913 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260430 02:02:37.623225 28913 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260430 02:02:37.623446 28913 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260430 02:02:37.624058 28913 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260430 02:02:37.625444 28935 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20260430 02:02:37.629096 28913 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20260430 02:02:37.629227 28913 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.004s user 0.001s sys 0.000s
I20260430 02:02:37.629312 28913 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
W20260430 02:02:37.630162 28260 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57056: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
I20260430 02:02:37.631333 28913 ts_tablet_manager.cc:616] Registered 1 tablets
I20260430 02:02:37.631385 28913 ts_tablet_manager.cc:595] Time spent register tablets: real 0.002s user 0.001s sys 0.000s
I20260430 02:02:37.631830 28935 tablet_bootstrap.cc:492] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0: Bootstrap starting.
I20260430 02:02:37.645049 28913 rpc_server.cc:307] RPC server started. Bound to: 127.24.153.68:41611
I20260430 02:02:37.645375 29042 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.24.153.68:41611 every 8 connection(s)
I20260430 02:02:37.646255 28913 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/data/info.pb
I20260430 02:02:37.648803 25189 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu as pid 28913
I20260430 02:02:37.648954 25189 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu with pid 28216
I20260430 02:02:37.660221 25189 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
/tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.24.153.65:45451
--local_ip_for_outbound_sockets=127.24.153.65
--tserver_master_addrs=127.24.153.126:41269
--webserver_port=41493
--webserver_interface=127.24.153.65
--builtin_ntp_servers=127.24.153.84:44061
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20260430 02:02:37.667093 29043 heartbeater.cc:344] Connected to a master server at 127.24.153.126:41269
I20260430 02:02:37.667627 29043 heartbeater.cc:461] Registering TS with master...
I20260430 02:02:37.669016 29043 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
I20260430 02:02:37.671892 26369 ts_manager.cc:194] Re-registered known tserver with Master: fdcc2c1450744cb499d898b871318fa0 (127.24.153.68:41611)
I20260430 02:02:37.673255 26369 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.24.153.68:50375
W20260430 02:02:37.677546 28376 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
I20260430 02:02:37.699003 26994 meta_cache.cc:1510] marking tablet server 7ab3773c508c494d8e0246ad5c98fbc0 (127.24.153.65:45451) as failed
W20260430 02:02:37.759538 28376 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:37.806740 29046 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260430 02:02:37.807446 29046 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260430 02:02:37.807595 29046 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260430 02:02:37.812588 29046 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260430 02:02:37.812832 29046 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.24.153.65
I20260430 02:02:37.817389 29046 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.24.153.84:44061
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.24.153.65:45451
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.24.153.65
--webserver_port=41493
--tserver_master_addrs=127.24.153.126:41269
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.29046
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.24.153.65
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8537ed5bb5e0e6ac1e481ff961e6b9d08cce7671
build type DEBUG
built by None at 30 Apr 2026 01:43:13 UTC on bdcb31816ec0
build id 11668
I20260430 02:02:37.819051 29046 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260430 02:02:37.820336 29046 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260430 02:02:37.823438 29046 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260430 02:02:37.828251 29052 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:02:37.828239 29055 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:02:37.828239 29053 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260430 02:02:37.828745 29046 server_base.cc:1061] running on GCE node
I20260430 02:02:37.829277 29046 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260430 02:02:37.830031 29046 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260430 02:02:37.831220 29046 hybrid_clock.cc:648] HybridClock initialized: now 1777514557831190 us; error 44 us; skew 500 ppm
I20260430 02:02:37.833750 29046 webserver.cc:492] Webserver started at http://127.24.153.65:41493/ using document root <none> and password file <none>
I20260430 02:02:37.834544 29046 fs_manager.cc:362] Metadata directory not provided
I20260430 02:02:37.834628 29046 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260430 02:02:37.840713 29046 fs_manager.cc:714] Time spent opening directory manager: real 0.004s user 0.002s sys 0.003s
I20260430 02:02:37.843572 29061 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260430 02:02:37.844962 29046 fs_manager.cc:730] Time spent opening block manager: real 0.003s user 0.001s sys 0.000s
I20260430 02:02:37.845245 29046 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/data,/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/wal
uuid: "7ab3773c508c494d8e0246ad5c98fbc0"
format_stamp: "Formatted at 2026-04-30 02:02:13 on dist-test-slave-f7mg"
I20260430 02:02:37.845968 29046 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
W20260430 02:02:37.863185 28376 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43546: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
I20260430 02:02:37.864831 28935 log.cc:826] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0: Log is configured to *not* fsync() on all Append() calls
I20260430 02:02:37.870908 29046 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260430 02:02:37.872252 29046 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260430 02:02:37.872500 29046 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260430 02:02:37.873412 29046 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260430 02:02:37.875159 29069 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20260430 02:02:37.879278 29046 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20260430 02:02:37.879359 29046 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.004s user 0.001s sys 0.000s
I20260430 02:02:37.879432 29046 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20260430 02:02:37.881347 29046 ts_tablet_manager.cc:616] Registered 1 tablets
I20260430 02:02:37.881444 29046 ts_tablet_manager.cc:595] Time spent register tablets: real 0.002s user 0.000s sys 0.000s
I20260430 02:02:37.881783 29069 tablet_bootstrap.cc:492] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0: Bootstrap starting.
I20260430 02:02:37.896102 29046 rpc_server.cc:307] RPC server started. Bound to: 127.24.153.65:45451
I20260430 02:02:37.896329 29176 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.24.153.65:45451 every 8 connection(s)
I20260430 02:02:37.897255 29046 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/data/info.pb
I20260430 02:02:37.904480 25189 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu as pid 29046
I20260430 02:02:37.904634 25189 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu with pid 28351
I20260430 02:02:37.917732 25189 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
/tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.24.153.67:43357
--local_ip_for_outbound_sockets=127.24.153.67
--tserver_master_addrs=127.24.153.126:41269
--webserver_port=43087
--webserver_interface=127.24.153.67
--builtin_ntp_servers=127.24.153.84:44061
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20260430 02:02:37.928459 29177 heartbeater.cc:344] Connected to a master server at 127.24.153.126:41269
I20260430 02:02:37.928781 29177 heartbeater.cc:461] Registering TS with master...
I20260430 02:02:37.929551 29177 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
I20260430 02:02:37.931923 26369 ts_manager.cc:194] Re-registered known tserver with Master: 7ab3773c508c494d8e0246ad5c98fbc0 (127.24.153.65:45451)
I20260430 02:02:37.932965 26369 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.24.153.65:58599
W20260430 02:02:38.058336 29180 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260430 02:02:38.058609 29180 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260430 02:02:38.058713 29180 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260430 02:02:38.062618 29180 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260430 02:02:38.062863 29180 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.24.153.67
I20260430 02:02:38.067695 29180 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.24.153.84:44061
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.24.153.67:43357
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.24.153.67
--webserver_port=43087
--tserver_master_addrs=127.24.153.126:41269
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.29180
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.24.153.67
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8537ed5bb5e0e6ac1e481ff961e6b9d08cce7671
build type DEBUG
built by None at 30 Apr 2026 01:43:13 UTC on bdcb31816ec0
build id 11668
I20260430 02:02:38.069093 29180 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260430 02:02:38.070384 29180 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260430 02:02:38.073220 29180 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260430 02:02:38.077849 29186 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:02:38.078272 29187 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260430 02:02:38.079069 29180 server_base.cc:1061] running on GCE node
W20260430 02:02:38.082118 29189 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260430 02:02:38.082850 29180 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260430 02:02:38.083639 29180 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260430 02:02:38.084847 29180 hybrid_clock.cc:648] HybridClock initialized: now 1777514558084810 us; error 51 us; skew 500 ppm
I20260430 02:02:38.087270 29180 webserver.cc:492] Webserver started at http://127.24.153.67:43087/ using document root <none> and password file <none>
I20260430 02:02:38.087908 29180 fs_manager.cc:362] Metadata directory not provided
I20260430 02:02:38.088011 29180 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260430 02:02:38.092170 29180 fs_manager.cc:714] Time spent opening directory manager: real 0.003s user 0.004s sys 0.000s
I20260430 02:02:38.094797 29195 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260430 02:02:38.097048 29180 fs_manager.cc:730] Time spent opening block manager: real 0.004s user 0.003s sys 0.001s
I20260430 02:02:38.097270 29180 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/data,/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/wal
uuid: "5a60f2bd808d4e6da03625fcdb9225e8"
format_stamp: "Formatted at 2026-04-30 02:02:14 on dist-test-slave-f7mg"
I20260430 02:02:38.097922 29180 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260430 02:02:38.125876 29180 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260430 02:02:38.126639 29180 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260430 02:02:38.126863 29180 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260430 02:02:38.127519 29180 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260430 02:02:38.128933 29202 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20260430 02:02:38.132483 29180 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20260430 02:02:38.132615 29180 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.004s user 0.001s sys 0.000s
I20260430 02:02:38.132696 29180 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20260430 02:02:38.134646 29180 ts_tablet_manager.cc:616] Registered 1 tablets
I20260430 02:02:38.134720 29180 ts_tablet_manager.cc:595] Time spent register tablets: real 0.002s user 0.002s sys 0.000s
I20260430 02:02:38.135118 29202 tablet_bootstrap.cc:492] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8: Bootstrap starting.
I20260430 02:02:38.149889 29180 rpc_server.cc:307] RPC server started. Bound to: 127.24.153.67:43357
I20260430 02:02:38.151232 29180 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/data/info.pb
I20260430 02:02:38.151762 29309 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.24.153.67:43357 every 8 connection(s)
I20260430 02:02:38.159759 25189 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu as pid 29180
I20260430 02:02:38.164662 29310 heartbeater.cc:344] Connected to a master server at 127.24.153.126:41269
I20260430 02:02:38.165138 29310 heartbeater.cc:461] Registering TS with master...
I20260430 02:02:38.166090 29310 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
I20260430 02:02:38.168550 26369 ts_manager.cc:194] Re-registered known tserver with Master: 5a60f2bd808d4e6da03625fcdb9225e8 (127.24.153.67:43357)
I20260430 02:02:38.169581 26369 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.24.153.67:38539
I20260430 02:02:38.177147 29069 log.cc:826] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0: Log is configured to *not* fsync() on all Append() calls
I20260430 02:02:38.420078 29202 log.cc:826] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8: Log is configured to *not* fsync() on all Append() calls
I20260430 02:02:38.435006 28909 heartbeater.cc:499] Master 127.24.153.126:41269 was elected leader, sending a full tablet report...
I20260430 02:02:38.444172 29111 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260430 02:02:38.453887 29244 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260430 02:02:38.466719 28977 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260430 02:02:38.509030 28843 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260430 02:02:38.679435 29043 heartbeater.cc:499] Master 127.24.153.126:41269 was elected leader, sending a full tablet report...
I20260430 02:02:38.935590 29177 heartbeater.cc:499] Master 127.24.153.126:41269 was elected leader, sending a full tablet report...
I20260430 02:02:39.172142 29310 heartbeater.cc:499] Master 127.24.153.126:41269 was elected leader, sending a full tablet report...
I20260430 02:02:41.750396 29069 tablet_bootstrap.cc:492] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0: Bootstrap replayed 1/1 log segments. Stats: ops{read=1926 overwritten=0 applied=1926 ignored=0} inserts{seen=96150 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20260430 02:02:41.750942 29069 tablet_bootstrap.cc:492] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0: Bootstrap complete.
I20260430 02:02:41.752452 29069 ts_tablet_manager.cc:1403] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0: Time spent bootstrapping tablet: real 3.871s user 3.620s sys 0.172s
I20260430 02:02:41.756033 29069 raft_consensus.cc:359] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 7 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:41.756453 29069 raft_consensus.cc:740] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 7 FOLLOWER]: Becoming Follower/Learner. State: Replica: 7ab3773c508c494d8e0246ad5c98fbc0, State: Initialized, Role: FOLLOWER
I20260430 02:02:41.756996 29069 consensus_queue.cc:260] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 1926, Last appended: 5.1926, Last appended by leader: 1926, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:41.758013 29069 ts_tablet_manager.cc:1434] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0: Time spent starting tablet: real 0.005s user 0.008s sys 0.000s
W20260430 02:02:41.878505 29090 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54600: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:41.937103 29090 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54600: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:41.957896 29090 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54600: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
I20260430 02:02:42.040021 29348 raft_consensus.cc:493] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 7 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260430 02:02:42.040246 29348 raft_consensus.cc:515] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 7 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:42.042769 29348 leader_election.cc:290] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [CANDIDATE]: Term 8 pre-election: Requested pre-vote from peers fdcc2c1450744cb499d898b871318fa0 (127.24.153.68:41611), 5a60f2bd808d4e6da03625fcdb9225e8 (127.24.153.67:43357)
I20260430 02:02:42.047874 28996 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" candidate_term: 8 candidate_status { last_received { term: 5 index: 1926 } } ignore_live_leader: false dest_uuid: "fdcc2c1450744cb499d898b871318fa0" is_pre_election: true
I20260430 02:02:42.048761 29255 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" candidate_term: 8 candidate_status { last_received { term: 5 index: 1926 } } ignore_live_leader: false dest_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" is_pre_election: true
W20260430 02:02:42.050770 29063 leader_election.cc:343] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [CANDIDATE]: Term 8 pre-election: Tablet error from VoteRequest() call to peer fdcc2c1450744cb499d898b871318fa0 (127.24.153.68:41611): Illegal state: must be running to vote when last-logged opid is not known
W20260430 02:02:42.050995 29065 leader_election.cc:343] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [CANDIDATE]: Term 8 pre-election: Tablet error from VoteRequest() call to peer 5a60f2bd808d4e6da03625fcdb9225e8 (127.24.153.67:43357): Illegal state: must be running to vote when last-logged opid is not known
I20260430 02:02:42.051090 29065 leader_election.cc:304] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [CANDIDATE]: Term 8 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 7ab3773c508c494d8e0246ad5c98fbc0; no voters: 5a60f2bd808d4e6da03625fcdb9225e8, fdcc2c1450744cb499d898b871318fa0
I20260430 02:02:42.051406 29348 raft_consensus.cc:2749] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 7 FOLLOWER]: Leader pre-election lost for term 8. Reason: could not achieve majority
W20260430 02:02:42.149389 29090 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54600: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:42.241743 26995 scanner-internal.cc:458] Time spent opening tablet: real 5.727s user 0.006s sys 0.000s
W20260430 02:02:42.288501 29091 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54600: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:42.335259 29091 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54600: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:42.391067 29091 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54600: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
I20260430 02:02:42.452618 29348 raft_consensus.cc:493] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 7 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260430 02:02:42.452878 29348 raft_consensus.cc:515] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 7 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:42.453328 29348 leader_election.cc:290] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [CANDIDATE]: Term 8 pre-election: Requested pre-vote from peers fdcc2c1450744cb499d898b871318fa0 (127.24.153.68:41611), 5a60f2bd808d4e6da03625fcdb9225e8 (127.24.153.67:43357)
I20260430 02:02:42.454319 28996 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" candidate_term: 8 candidate_status { last_received { term: 5 index: 1926 } } ignore_live_leader: false dest_uuid: "fdcc2c1450744cb499d898b871318fa0" is_pre_election: true
I20260430 02:02:42.454686 29255 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" candidate_term: 8 candidate_status { last_received { term: 5 index: 1926 } } ignore_live_leader: false dest_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" is_pre_election: true
W20260430 02:02:42.455179 29065 leader_election.cc:343] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [CANDIDATE]: Term 8 pre-election: Tablet error from VoteRequest() call to peer 5a60f2bd808d4e6da03625fcdb9225e8 (127.24.153.67:43357): Illegal state: must be running to vote when last-logged opid is not known
W20260430 02:02:42.455521 29063 leader_election.cc:343] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [CANDIDATE]: Term 8 pre-election: Tablet error from VoteRequest() call to peer fdcc2c1450744cb499d898b871318fa0 (127.24.153.68:41611): Illegal state: must be running to vote when last-logged opid is not known
I20260430 02:02:42.455588 29063 leader_election.cc:304] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [CANDIDATE]: Term 8 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 7ab3773c508c494d8e0246ad5c98fbc0; no voters: 5a60f2bd808d4e6da03625fcdb9225e8, fdcc2c1450744cb499d898b871318fa0
I20260430 02:02:42.455907 29348 raft_consensus.cc:2749] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 7 FOLLOWER]: Leader pre-election lost for term 8. Reason: could not achieve majority
I20260430 02:02:42.465744 28935 tablet_bootstrap.cc:492] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0: Bootstrap replayed 1/1 log segments. Stats: ops{read=1927 overwritten=0 applied=1927 ignored=0} inserts{seen=96200 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20260430 02:02:42.466475 28935 tablet_bootstrap.cc:492] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0: Bootstrap complete.
I20260430 02:02:42.468837 28935 ts_tablet_manager.cc:1403] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0: Time spent bootstrapping tablet: real 4.837s user 4.596s sys 0.148s
I20260430 02:02:42.472467 28935 raft_consensus.cc:359] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 6 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:42.473030 28935 raft_consensus.cc:740] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 6 FOLLOWER]: Becoming Follower/Learner. State: Replica: fdcc2c1450744cb499d898b871318fa0, State: Initialized, Role: FOLLOWER
I20260430 02:02:42.473619 28935 consensus_queue.cc:260] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 1927, Last appended: 5.1927, Last appended by leader: 1927, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:42.474669 28935 ts_tablet_manager.cc:1434] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0: Time spent starting tablet: real 0.006s user 0.005s sys 0.000s
W20260430 02:02:42.515456 28957 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49770: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:42.523136 26994 scanner-internal.cc:458] Time spent opening tablet: real 6.038s user 0.006s sys 0.000s
W20260430 02:02:42.607733 26996 scanner-internal.cc:458] Time spent opening tablet: real 5.721s user 0.006s sys 0.000s
W20260430 02:02:42.618322 28957 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49770: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:42.626405 28957 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49770: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:42.632638 29090 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54600: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:42.720890 29091 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54600: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:42.735632 29089 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54600: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
I20260430 02:02:42.782364 29354 raft_consensus.cc:493] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 6 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260430 02:02:42.782614 29354 raft_consensus.cc:515] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 6 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:42.784339 29354 leader_election.cc:290] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [CANDIDATE]: Term 7 pre-election: Requested pre-vote from peers 7ab3773c508c494d8e0246ad5c98fbc0 (127.24.153.65:45451), 5a60f2bd808d4e6da03625fcdb9225e8 (127.24.153.67:43357)
I20260430 02:02:42.796985 29131 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "fdcc2c1450744cb499d898b871318fa0" candidate_term: 7 candidate_status { last_received { term: 5 index: 1927 } } ignore_live_leader: false dest_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" is_pre_election: true
I20260430 02:02:42.797356 29131 raft_consensus.cc:2393] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 7 FOLLOWER]: Leader pre-election vote request: Denying vote to candidate fdcc2c1450744cb499d898b871318fa0 in current term 7: Already voted for candidate 7ab3773c508c494d8e0246ad5c98fbc0 in this term.
I20260430 02:02:42.798465 29255 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "fdcc2c1450744cb499d898b871318fa0" candidate_term: 7 candidate_status { last_received { term: 5 index: 1927 } } ignore_live_leader: false dest_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" is_pre_election: true
W20260430 02:02:42.799782 28932 leader_election.cc:343] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [CANDIDATE]: Term 7 pre-election: Tablet error from VoteRequest() call to peer 5a60f2bd808d4e6da03625fcdb9225e8 (127.24.153.67:43357): Illegal state: must be running to vote when last-logged opid is not known
I20260430 02:02:42.800419 28932 leader_election.cc:304] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [CANDIDATE]: Term 7 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: fdcc2c1450744cb499d898b871318fa0; no voters: 5a60f2bd808d4e6da03625fcdb9225e8, 7ab3773c508c494d8e0246ad5c98fbc0
I20260430 02:02:42.800976 29354 raft_consensus.cc:3060] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 6 FOLLOWER]: Advancing to term 7
I20260430 02:02:42.805020 29354 raft_consensus.cc:2749] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 7 FOLLOWER]: Leader pre-election lost for term 7. Reason: could not achieve majority
W20260430 02:02:42.866742 28957 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49770: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:42.927457 28957 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49770: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
W20260430 02:02:42.972282 28957 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49770: Illegal state: replica fdcc2c1450744cb499d898b871318fa0 is not leader of this config: current role FOLLOWER
I20260430 02:02:42.972973 29359 raft_consensus.cc:493] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 7 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260430 02:02:42.973150 29359 raft_consensus.cc:515] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 7 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:42.974030 29359 leader_election.cc:290] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [CANDIDATE]: Term 8 pre-election: Requested pre-vote from peers fdcc2c1450744cb499d898b871318fa0 (127.24.153.68:41611), 5a60f2bd808d4e6da03625fcdb9225e8 (127.24.153.67:43357)
I20260430 02:02:42.974026 28996 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" candidate_term: 8 candidate_status { last_received { term: 5 index: 1926 } } ignore_live_leader: false dest_uuid: "fdcc2c1450744cb499d898b871318fa0" is_pre_election: true
I20260430 02:02:42.974298 28996 raft_consensus.cc:2410] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 7 FOLLOWER]: Leader pre-election vote request: Denying vote to candidate 7ab3773c508c494d8e0246ad5c98fbc0 for term 8 because replica has last-logged OpId of term: 5 index: 1927, which is greater than that of the candidate, which has last-logged OpId of term: 5 index: 1926.
I20260430 02:02:42.974836 29255 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" candidate_term: 8 candidate_status { last_received { term: 5 index: 1926 } } ignore_live_leader: false dest_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" is_pre_election: true
W20260430 02:02:42.975675 29065 leader_election.cc:343] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [CANDIDATE]: Term 8 pre-election: Tablet error from VoteRequest() call to peer 5a60f2bd808d4e6da03625fcdb9225e8 (127.24.153.67:43357): Illegal state: must be running to vote when last-logged opid is not known
I20260430 02:02:42.975773 29065 leader_election.cc:304] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [CANDIDATE]: Term 8 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 7ab3773c508c494d8e0246ad5c98fbc0; no voters: 5a60f2bd808d4e6da03625fcdb9225e8, fdcc2c1450744cb499d898b871318fa0
I20260430 02:02:42.975944 29359 raft_consensus.cc:2749] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 7 FOLLOWER]: Leader pre-election lost for term 8. Reason: could not achieve majority
W20260430 02:02:42.984766 29089 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54600: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:43.029166 29091 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54600: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
W20260430 02:02:43.090621 29091 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:54600: Illegal state: replica 7ab3773c508c494d8e0246ad5c98fbc0 is not leader of this config: current role FOLLOWER
I20260430 02:02:43.151562 29354 raft_consensus.cc:493] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 7 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260430 02:02:43.151796 29354 raft_consensus.cc:515] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 7 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:43.152261 29354 leader_election.cc:290] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [CANDIDATE]: Term 8 pre-election: Requested pre-vote from peers 7ab3773c508c494d8e0246ad5c98fbc0 (127.24.153.65:45451), 5a60f2bd808d4e6da03625fcdb9225e8 (127.24.153.67:43357)
I20260430 02:02:43.152688 29131 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "fdcc2c1450744cb499d898b871318fa0" candidate_term: 8 candidate_status { last_received { term: 5 index: 1927 } } ignore_live_leader: false dest_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" is_pre_election: true
I20260430 02:02:43.152868 29131 raft_consensus.cc:2468] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 7 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate fdcc2c1450744cb499d898b871318fa0 in term 7.
I20260430 02:02:43.153242 29255 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "fdcc2c1450744cb499d898b871318fa0" candidate_term: 8 candidate_status { last_received { term: 5 index: 1927 } } ignore_live_leader: false dest_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" is_pre_election: true
W20260430 02:02:43.153828 28932 leader_election.cc:343] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [CANDIDATE]: Term 8 pre-election: Tablet error from VoteRequest() call to peer 5a60f2bd808d4e6da03625fcdb9225e8 (127.24.153.67:43357): Illegal state: must be running to vote when last-logged opid is not known
I20260430 02:02:43.154349 28930 leader_election.cc:304] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [CANDIDATE]: Term 8 pre-election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 7ab3773c508c494d8e0246ad5c98fbc0, fdcc2c1450744cb499d898b871318fa0; no voters: 5a60f2bd808d4e6da03625fcdb9225e8
I20260430 02:02:43.154636 29354 raft_consensus.cc:2804] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 7 FOLLOWER]: Leader pre-election won for term 8
I20260430 02:02:43.154711 29354 raft_consensus.cc:493] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 7 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260430 02:02:43.154749 29354 raft_consensus.cc:3060] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 7 FOLLOWER]: Advancing to term 8
I20260430 02:02:43.155853 29354 raft_consensus.cc:515] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 8 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:43.156261 29354 leader_election.cc:290] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [CANDIDATE]: Term 8 election: Requested vote from peers 7ab3773c508c494d8e0246ad5c98fbc0 (127.24.153.65:45451), 5a60f2bd808d4e6da03625fcdb9225e8 (127.24.153.67:43357)
I20260430 02:02:43.157310 29255 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "fdcc2c1450744cb499d898b871318fa0" candidate_term: 8 candidate_status { last_received { term: 5 index: 1927 } } ignore_live_leader: false dest_uuid: "5a60f2bd808d4e6da03625fcdb9225e8"
I20260430 02:02:43.157660 29131 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "fdcc2c1450744cb499d898b871318fa0" candidate_term: 8 candidate_status { last_received { term: 5 index: 1927 } } ignore_live_leader: false dest_uuid: "7ab3773c508c494d8e0246ad5c98fbc0"
I20260430 02:02:43.157819 29131 raft_consensus.cc:3060] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 7 FOLLOWER]: Advancing to term 8
W20260430 02:02:43.158349 28932 leader_election.cc:343] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [CANDIDATE]: Term 8 election: Tablet error from VoteRequest() call to peer 5a60f2bd808d4e6da03625fcdb9225e8 (127.24.153.67:43357): Illegal state: must be running to vote when last-logged opid is not known
I20260430 02:02:43.160641 29131 raft_consensus.cc:2468] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 8 FOLLOWER]: Leader election vote request: Granting yes vote for candidate fdcc2c1450744cb499d898b871318fa0 in term 8.
I20260430 02:02:43.161195 28930 leader_election.cc:304] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [CANDIDATE]: Term 8 election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 7ab3773c508c494d8e0246ad5c98fbc0, fdcc2c1450744cb499d898b871318fa0; no voters: 5a60f2bd808d4e6da03625fcdb9225e8
I20260430 02:02:43.161602 29354 raft_consensus.cc:2804] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 8 FOLLOWER]: Leader election won for term 8
I20260430 02:02:43.161999 29354 raft_consensus.cc:697] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 8 LEADER]: Becoming Leader. State: Replica: fdcc2c1450744cb499d898b871318fa0, State: Running, Role: LEADER
I20260430 02:02:43.162410 29354 consensus_queue.cc:237] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 1927, Committed index: 1927, Last appended: 5.1927, Last appended by leader: 1927, Current term: 8, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:43.165953 26369 catalog_manager.cc:5671] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 reported cstate change: term changed from 6 to 8, leader changed from 5a60f2bd808d4e6da03625fcdb9225e8 (127.24.153.67) to fdcc2c1450744cb499d898b871318fa0 (127.24.153.68). New cstate: current_term: 8 leader_uuid: "fdcc2c1450744cb499d898b871318fa0" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } health_report { overall_health: UNKNOWN } } }
W20260430 02:02:43.217288 28932 consensus_peers.cc:597] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 -> Peer 5a60f2bd808d4e6da03625fcdb9225e8 (127.24.153.67:43357): Couldn't send request to peer 5a60f2bd808d4e6da03625fcdb9225e8. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: BOOTSTRAPPING. This is attempt 1: this message will repeat every 5th retry.
I20260430 02:02:43.217301 29131 raft_consensus.cc:1275] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 8 FOLLOWER]: Refusing update from remote peer fdcc2c1450744cb499d898b871318fa0: Log matching property violated. Preceding OpId in replica: term: 5 index: 1926. Preceding OpId from leader: term: 8 index: 1929. (index mismatch)
I20260430 02:02:43.217949 29360 consensus_queue.cc:1048] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [LEADER]: Connected to new peer: Peer: permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1928, Last known committed idx: 1926, Time since last communication: 0.000s
I20260430 02:02:43.239095 29367 mvcc.cc:204] Tried to move back new op lower bound from 7280699650929999872 to 7280699650717163520. Current Snapshot: MvccSnapshot[applied={T|T < 7280699650929999872}]
I20260430 02:02:43.283502 29202 tablet_bootstrap.cc:492] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8: Bootstrap replayed 1/1 log segments. Stats: ops{read=1929 overwritten=0 applied=1927 ignored=0} inserts{seen=96200 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20260430 02:02:43.284344 29202 tablet_bootstrap.cc:492] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8: Bootstrap complete.
I20260430 02:02:43.286341 29202 ts_tablet_manager.cc:1403] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8: Time spent bootstrapping tablet: real 5.152s user 4.904s sys 0.143s
I20260430 02:02:43.290463 29202 raft_consensus.cc:359] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 7 FOLLOWER]: Replica starting. Triggering 2 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:43.292652 29202 raft_consensus.cc:740] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 7 FOLLOWER]: Becoming Follower/Learner. State: Replica: 5a60f2bd808d4e6da03625fcdb9225e8, State: Initialized, Role: FOLLOWER
I20260430 02:02:43.293336 29202 consensus_queue.cc:260] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 1927, Last appended: 6.1929, Last appended by leader: 1929, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:43.294685 29202 ts_tablet_manager.cc:1434] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8: Time spent starting tablet: real 0.008s user 0.006s sys 0.001s
I20260430 02:02:43.320292 29255 raft_consensus.cc:3060] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 7 FOLLOWER]: Advancing to term 8
I20260430 02:02:43.323693 29255 pending_rounds.cc:85] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8: Aborting all ops after (but not including) 1927
I20260430 02:02:43.328042 29255 pending_rounds.cc:107] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8: Aborting uncommitted NO_OP operation due to leader change: 6.1928
I20260430 02:02:43.328315 29255 raft_consensus.cc:2889] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 8 FOLLOWER]: NO_OP replication failed: Aborted: Op aborted by new leader
I20260430 02:02:43.328745 29255 pending_rounds.cc:107] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8: Aborting uncommitted WRITE_OP operation due to leader change: 6.1929
I20260430 02:02:43.361334 29377 mvcc.cc:204] Tried to move back new op lower bound from 7280699651313737728 to 7280699650717163520. Current Snapshot: MvccSnapshot[applied={T|T < 7280699651147046912}]
I20260430 02:02:44.136015 28977 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
I20260430 02:02:44.137072 29244 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260430 02:02:44.161517 28843 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260430 02:02:44.161545 29111 tablet_service.cc:1467] Tablet server has 0 leaders and 3 scanners
I20260430 02:02:44.638692 26369 ts_manager.cc:284] Unset tserver state for 7ab3773c508c494d8e0246ad5c98fbc0 from MAINTENANCE_MODE
I20260430 02:02:44.718968 26369 ts_manager.cc:284] Unset tserver state for 5a60f2bd808d4e6da03625fcdb9225e8 from MAINTENANCE_MODE
I20260430 02:02:44.823493 26369 ts_manager.cc:284] Unset tserver state for fdcc2c1450744cb499d898b871318fa0 from MAINTENANCE_MODE
I20260430 02:02:44.859774 26369 ts_manager.cc:284] Unset tserver state for a2771da784d84201a3de0860ab987f1f from MAINTENANCE_MODE
I20260430 02:02:45.250692 29177 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
I20260430 02:02:45.345906 29310 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
I20260430 02:02:45.370632 29043 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
I20260430 02:02:45.456754 28909 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
I20260430 02:02:45.753881 26369 ts_manager.cc:295] Set tserver state for a2771da784d84201a3de0860ab987f1f to MAINTENANCE_MODE
I20260430 02:02:45.798646 26369 ts_manager.cc:295] Set tserver state for 5a60f2bd808d4e6da03625fcdb9225e8 to MAINTENANCE_MODE
I20260430 02:02:45.802912 26369 ts_manager.cc:295] Set tserver state for 7ab3773c508c494d8e0246ad5c98fbc0 to MAINTENANCE_MODE
I20260430 02:02:45.834041 26369 ts_manager.cc:295] Set tserver state for fdcc2c1450744cb499d898b871318fa0 to MAINTENANCE_MODE
I20260430 02:02:46.311282 28977 tablet_service.cc:1460] Tablet server fdcc2c1450744cb499d898b871318fa0 set to quiescing
I20260430 02:02:46.311461 28977 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
I20260430 02:02:46.317188 29111 tablet_service.cc:1460] Tablet server 7ab3773c508c494d8e0246ad5c98fbc0 set to quiescing
I20260430 02:02:46.317358 29111 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260430 02:02:46.326609 29429 raft_consensus.cc:993] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0: : Instructing follower 5a60f2bd808d4e6da03625fcdb9225e8 to start an election
I20260430 02:02:46.326748 29429 raft_consensus.cc:1081] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 8 LEADER]: Signalling peer 5a60f2bd808d4e6da03625fcdb9225e8 to start an election
I20260430 02:02:46.331751 29479 raft_consensus.cc:993] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0: : Instructing follower 5a60f2bd808d4e6da03625fcdb9225e8 to start an election
I20260430 02:02:46.331986 29479 raft_consensus.cc:1081] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 8 LEADER]: Signalling peer 5a60f2bd808d4e6da03625fcdb9225e8 to start an election
I20260430 02:02:46.332566 29255 tablet_service.cc:2044] Received Run Leader Election RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6"
dest_uuid: "5a60f2bd808d4e6da03625fcdb9225e8"
from {username='slave'} at 127.24.153.68:35485
I20260430 02:02:46.332823 29255 raft_consensus.cc:493] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 8 FOLLOWER]: Starting forced leader election (received explicit request)
I20260430 02:02:46.333002 29255 raft_consensus.cc:3060] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 8 FOLLOWER]: Advancing to term 9
I20260430 02:02:46.334375 29255 raft_consensus.cc:515] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 9 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:46.335747 29255 leader_election.cc:290] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [CANDIDATE]: Term 9 election: Requested vote from peers fdcc2c1450744cb499d898b871318fa0 (127.24.153.68:41611), 7ab3773c508c494d8e0246ad5c98fbc0 (127.24.153.65:45451)
I20260430 02:02:46.338318 29261 tablet_service.cc:2044] Received Run Leader Election RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6"
dest_uuid: "5a60f2bd808d4e6da03625fcdb9225e8"
from {username='slave'} at 127.24.153.68:35485
I20260430 02:02:46.338565 29261 raft_consensus.cc:493] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 9 FOLLOWER]: Starting forced leader election (received explicit request)
I20260430 02:02:46.338716 29261 raft_consensus.cc:3060] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 9 FOLLOWER]: Advancing to term 10
I20260430 02:02:46.340364 29261 raft_consensus.cc:515] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 10 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:46.340960 29261 leader_election.cc:290] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [CANDIDATE]: Term 10 election: Requested vote from peers fdcc2c1450744cb499d898b871318fa0 (127.24.153.68:41611), 7ab3773c508c494d8e0246ad5c98fbc0 (127.24.153.65:45451)
I20260430 02:02:46.343952 29255 raft_consensus.cc:1240] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 10 FOLLOWER]: Rejecting Update request from peer fdcc2c1450744cb499d898b871318fa0 for earlier term 8. Current term is 10. Ops: [8.2263-8.2263]
I20260430 02:02:46.344435 29376 consensus_queue.cc:1059] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [LEADER]: Peer responded invalid term: Peer: permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 }, Status: INVALID_TERM, Last received: 8.2262, Next index: 2263, Last known committed idx: 2262, Time since last communication: 0.000s
I20260430 02:02:46.344738 29376 raft_consensus.cc:3055] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 8 LEADER]: Stepping down as leader of term 8
I20260430 02:02:46.344863 29376 raft_consensus.cc:740] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 8 LEADER]: Becoming Follower/Learner. State: Replica: fdcc2c1450744cb499d898b871318fa0, State: Running, Role: LEADER
I20260430 02:02:46.346465 29376 consensus_queue.cc:260] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 2262, Committed index: 2262, Last appended: 8.2265, Last appended by leader: 2265, Current term: 8, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:46.346834 29376 raft_consensus.cc:3060] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 8 FOLLOWER]: Advancing to term 10
I20260430 02:02:46.352540 28996 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" candidate_term: 9 candidate_status { last_received { term: 8 index: 2262 } } ignore_live_leader: true dest_uuid: "fdcc2c1450744cb499d898b871318fa0"
I20260430 02:02:46.352736 28996 raft_consensus.cc:2368] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 10 FOLLOWER]: Leader election vote request: Denying vote to candidate 5a60f2bd808d4e6da03625fcdb9225e8 for earlier term 9. Current term is 10.
I20260430 02:02:46.352953 29130 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" candidate_term: 10 candidate_status { last_received { term: 8 index: 2262 } } ignore_live_leader: true dest_uuid: "7ab3773c508c494d8e0246ad5c98fbc0"
I20260430 02:02:46.352989 28997 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" candidate_term: 10 candidate_status { last_received { term: 8 index: 2262 } } ignore_live_leader: true dest_uuid: "fdcc2c1450744cb499d898b871318fa0"
I20260430 02:02:46.353084 29130 raft_consensus.cc:3060] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 8 FOLLOWER]: Advancing to term 10
I20260430 02:02:46.353139 28997 raft_consensus.cc:2410] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 10 FOLLOWER]: Leader election vote request: Denying vote to candidate 5a60f2bd808d4e6da03625fcdb9225e8 for term 10 because replica has last-logged OpId of term: 8 index: 2265, which is greater than that of the candidate, which has last-logged OpId of term: 8 index: 2262.
I20260430 02:02:46.353130 29131 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" candidate_term: 9 candidate_status { last_received { term: 8 index: 2262 } } ignore_live_leader: true dest_uuid: "7ab3773c508c494d8e0246ad5c98fbc0"
I20260430 02:02:46.354507 29197 leader_election.cc:400] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [CANDIDATE]: Term 9 election: Vote denied by peer fdcc2c1450744cb499d898b871318fa0 (127.24.153.68:41611) with higher term. Message: Invalid argument: T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 10 FOLLOWER]: Leader election vote request: Denying vote to candidate 5a60f2bd808d4e6da03625fcdb9225e8 for earlier term 9. Current term is 10.
I20260430 02:02:46.354571 29197 leader_election.cc:403] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [CANDIDATE]: Term 9 election: Cancelling election due to peer responding with higher term
I20260430 02:02:46.356180 29130 raft_consensus.cc:2410] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 10 FOLLOWER]: Leader election vote request: Denying vote to candidate 5a60f2bd808d4e6da03625fcdb9225e8 for term 10 because replica has last-logged OpId of term: 8 index: 2265, which is greater than that of the candidate, which has last-logged OpId of term: 8 index: 2262.
I20260430 02:02:46.355334 29564 raft_consensus.cc:2749] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 10 FOLLOWER]: Leader election lost for term 9. Reason: Vote denied by peer fdcc2c1450744cb499d898b871318fa0 (127.24.153.68:41611) with higher term. Message: Invalid argument: T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 10 FOLLOWER]: Leader election vote request: Denying vote to candidate 5a60f2bd808d4e6da03625fcdb9225e8 for earlier term 9. Current term is 10.
I20260430 02:02:46.356796 29197 leader_election.cc:304] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [CANDIDATE]: Term 10 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 5a60f2bd808d4e6da03625fcdb9225e8; no voters: 7ab3773c508c494d8e0246ad5c98fbc0, fdcc2c1450744cb499d898b871318fa0
I20260430 02:02:46.357138 29564 raft_consensus.cc:2749] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 10 FOLLOWER]: Leader election lost for term 10. Reason: could not achieve majority
I20260430 02:02:46.359006 28843 tablet_service.cc:1460] Tablet server a2771da784d84201a3de0860ab987f1f set to quiescing
I20260430 02:02:46.359251 28843 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260430 02:02:46.431484 29244 tablet_service.cc:1460] Tablet server 5a60f2bd808d4e6da03625fcdb9225e8 set to quiescing
I20260430 02:02:46.431671 29244 tablet_service.cc:1467] Tablet server has 0 leaders and 3 scanners
W20260430 02:02:46.645836 29575 raft_consensus.cc:670] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0: failed to trigger leader election: Illegal state: leader elections are disabled
W20260430 02:02:46.719410 29376 raft_consensus.cc:670] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0: failed to trigger leader election: Illegal state: leader elections are disabled
W20260430 02:02:46.853828 29564 raft_consensus.cc:670] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8: failed to trigger leader election: Illegal state: leader elections are disabled
I20260430 02:02:47.523175 28977 tablet_service.cc:1460] Tablet server fdcc2c1450744cb499d898b871318fa0 set to quiescing
I20260430 02:02:47.523376 28977 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260430 02:02:47.632992 29244 tablet_service.cc:1460] Tablet server 5a60f2bd808d4e6da03625fcdb9225e8 set to quiescing
I20260430 02:02:47.633174 29244 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260430 02:02:47.692654 25189 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu with pid 28781
I20260430 02:02:47.700304 25189 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
/tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.24.153.66:44867
--local_ip_for_outbound_sockets=127.24.153.66
--tserver_master_addrs=127.24.153.126:41269
--webserver_port=32993
--webserver_interface=127.24.153.66
--builtin_ntp_servers=127.24.153.84:44061
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20260430 02:02:47.822407 29598 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260430 02:02:47.822708 29598 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260430 02:02:47.822765 29598 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260430 02:02:47.827656 29598 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260430 02:02:47.827817 29598 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.24.153.66
I20260430 02:02:47.832362 29598 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.24.153.84:44061
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.24.153.66:44867
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.24.153.66
--webserver_port=32993
--tserver_master_addrs=127.24.153.126:41269
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.29598
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.24.153.66
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8537ed5bb5e0e6ac1e481ff961e6b9d08cce7671
build type DEBUG
built by None at 30 Apr 2026 01:43:13 UTC on bdcb31816ec0
build id 11668
I20260430 02:02:47.834029 29598 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260430 02:02:47.835415 29598 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260430 02:02:47.838542 29598 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260430 02:02:47.844079 29603 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:02:47.844118 29604 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:02:47.844201 29606 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260430 02:02:47.844864 29598 server_base.cc:1061] running on GCE node
I20260430 02:02:47.845427 29598 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260430 02:02:47.846207 29598 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260430 02:02:47.847450 29598 hybrid_clock.cc:648] HybridClock initialized: now 1777514567847390 us; error 79 us; skew 500 ppm
I20260430 02:02:47.849776 29598 webserver.cc:492] Webserver started at http://127.24.153.66:32993/ using document root <none> and password file <none>
I20260430 02:02:47.850490 29598 fs_manager.cc:362] Metadata directory not provided
I20260430 02:02:47.850587 29598 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260430 02:02:47.854631 29598 fs_manager.cc:714] Time spent opening directory manager: real 0.003s user 0.004s sys 0.000s
I20260430 02:02:47.856863 29612 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260430 02:02:47.858242 29598 fs_manager.cc:730] Time spent opening block manager: real 0.003s user 0.003s sys 0.000s
I20260430 02:02:47.858403 29598 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/data,/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/wal
uuid: "a2771da784d84201a3de0860ab987f1f"
format_stamp: "Formatted at 2026-04-30 02:02:14 on dist-test-slave-f7mg"
I20260430 02:02:47.858876 29598 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260430 02:02:47.884876 29598 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260430 02:02:47.885668 29598 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260430 02:02:47.885888 29598 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260430 02:02:47.886583 29598 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260430 02:02:47.887612 29598 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260430 02:02:47.887687 29598 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20260430 02:02:47.887756 29598 ts_tablet_manager.cc:616] Registered 0 tablets
I20260430 02:02:47.887848 29598 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20260430 02:02:47.898334 29598 rpc_server.cc:307] RPC server started. Bound to: 127.24.153.66:44867
I20260430 02:02:47.898387 29725 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.24.153.66:44867 every 8 connection(s)
I20260430 02:02:47.899456 29598 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-1/data/info.pb
I20260430 02:02:47.908305 25189 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu as pid 29598
I20260430 02:02:47.908488 25189 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu with pid 28913
I20260430 02:02:47.911085 29726 heartbeater.cc:344] Connected to a master server at 127.24.153.126:41269
I20260430 02:02:47.911362 29726 heartbeater.cc:461] Registering TS with master...
I20260430 02:02:47.911929 29726 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
I20260430 02:02:47.913094 26369 ts_manager.cc:194] Re-registered known tserver with Master: a2771da784d84201a3de0860ab987f1f (127.24.153.66:44867)
I20260430 02:02:47.913789 26369 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.24.153.66:46605
W20260430 02:02:47.916944 26966 meta_cache.cc:302] tablet 5f11b12f19254ea9a609af28f72cdaa6: replica fdcc2c1450744cb499d898b871318fa0 (127.24.153.68:41611) has failed: Network error: recv got EOF from 127.24.153.68:41611 (error 108)
I20260430 02:02:47.917253 25189 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
/tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.24.153.68:41611
--local_ip_for_outbound_sockets=127.24.153.68
--tserver_master_addrs=127.24.153.126:41269
--webserver_port=34395
--webserver_interface=127.24.153.68
--builtin_ntp_servers=127.24.153.84:44061
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20260430 02:02:48.048813 29730 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260430 02:02:48.049050 29730 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260430 02:02:48.049101 29730 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260430 02:02:48.053197 29730 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260430 02:02:48.053372 29730 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.24.153.68
I20260430 02:02:48.057592 29730 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.24.153.84:44061
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.24.153.68:41611
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.24.153.68
--webserver_port=34395
--tserver_master_addrs=127.24.153.126:41269
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.29730
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.24.153.68
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8537ed5bb5e0e6ac1e481ff961e6b9d08cce7671
build type DEBUG
built by None at 30 Apr 2026 01:43:13 UTC on bdcb31816ec0
build id 11668
I20260430 02:02:48.058794 29730 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260430 02:02:48.059983 29730 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260430 02:02:48.062877 29730 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260430 02:02:48.067334 29735 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:02:48.067328 29738 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:02:48.067327 29736 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260430 02:02:48.068132 29730 server_base.cc:1061] running on GCE node
I20260430 02:02:48.068572 29730 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260430 02:02:48.069257 29730 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260430 02:02:48.070449 29730 hybrid_clock.cc:648] HybridClock initialized: now 1777514568070412 us; error 58 us; skew 500 ppm
I20260430 02:02:48.072942 29730 webserver.cc:492] Webserver started at http://127.24.153.68:34395/ using document root <none> and password file <none>
I20260430 02:02:48.073673 29730 fs_manager.cc:362] Metadata directory not provided
I20260430 02:02:48.073751 29730 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260430 02:02:48.077703 29730 fs_manager.cc:714] Time spent opening directory manager: real 0.003s user 0.003s sys 0.000s
I20260430 02:02:48.080359 29744 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260430 02:02:48.081636 29730 fs_manager.cc:730] Time spent opening block manager: real 0.003s user 0.000s sys 0.002s
I20260430 02:02:48.081802 29730 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/data,/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/wal
uuid: "fdcc2c1450744cb499d898b871318fa0"
format_stamp: "Formatted at 2026-04-30 02:02:14 on dist-test-slave-f7mg"
I20260430 02:02:48.082319 29730 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260430 02:02:48.102979 29730 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260430 02:02:48.103713 29730 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260430 02:02:48.103924 29730 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260430 02:02:48.104523 29730 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260430 02:02:48.106031 29751 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20260430 02:02:48.109459 29730 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20260430 02:02:48.109838 29730 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.004s user 0.000s sys 0.000s
I20260430 02:02:48.110086 29730 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20260430 02:02:48.112250 29730 ts_tablet_manager.cc:616] Registered 1 tablets
I20260430 02:02:48.112375 29730 ts_tablet_manager.cc:595] Time spent register tablets: real 0.002s user 0.000s sys 0.000s
I20260430 02:02:48.112802 29751 tablet_bootstrap.cc:492] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0: Bootstrap starting.
I20260430 02:02:48.124121 29730 rpc_server.cc:307] RPC server started. Bound to: 127.24.153.68:41611
I20260430 02:02:48.124179 29858 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.24.153.68:41611 every 8 connection(s)
I20260430 02:02:48.125203 29730 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-3/data/info.pb
I20260430 02:02:48.134214 29859 heartbeater.cc:344] Connected to a master server at 127.24.153.126:41269
I20260430 02:02:48.134501 29859 heartbeater.cc:461] Registering TS with master...
I20260430 02:02:48.135099 29859 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
I20260430 02:02:48.135294 25189 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu as pid 29730
I20260430 02:02:48.135447 25189 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu with pid 29046
I20260430 02:02:48.138633 26369 ts_manager.cc:194] Re-registered known tserver with Master: fdcc2c1450744cb499d898b871318fa0 (127.24.153.68:41611)
I20260430 02:02:48.139735 26369 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.24.153.68:48559
I20260430 02:02:48.144857 25189 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
/tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.24.153.65:45451
--local_ip_for_outbound_sockets=127.24.153.65
--tserver_master_addrs=127.24.153.126:41269
--webserver_port=41493
--webserver_interface=127.24.153.65
--builtin_ntp_servers=127.24.153.84:44061
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20260430 02:02:48.148427 29206 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39516: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:48.149945 29206 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39516: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:48.151038 29206 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39516: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:48.175482 29206 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39516: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:48.176491 29206 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39516: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:48.191394 29206 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39516: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:48.195657 29206 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39516: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:48.209587 29206 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39516: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:48.230535 29206 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39516: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:48.235795 29206 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39516: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:48.256585 29206 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39516: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:48.258710 29206 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39516: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:48.294051 29206 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39516: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:48.308213 29206 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39516: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:48.308293 29207 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39516: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
W20260430 02:02:48.308269 29862 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260430 02:02:48.308629 29862 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260430 02:02:48.308779 29862 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260430 02:02:48.313361 29862 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260430 02:02:48.313551 29862 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.24.153.65
I20260430 02:02:48.318221 29862 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.24.153.84:44061
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.24.153.65:45451
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.24.153.65
--webserver_port=41493
--tserver_master_addrs=127.24.153.126:41269
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.29862
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.24.153.65
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8537ed5bb5e0e6ac1e481ff961e6b9d08cce7671
build type DEBUG
built by None at 30 Apr 2026 01:43:13 UTC on bdcb31816ec0
build id 11668
I20260430 02:02:48.319383 29862 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260430 02:02:48.320578 29862 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260430 02:02:48.323639 29862 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260430 02:02:48.328474 29870 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:02:48.328541 29869 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:02:48.328787 29872 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260430 02:02:48.328773 29862 server_base.cc:1061] running on GCE node
I20260430 02:02:48.329226 29862 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260430 02:02:48.329833 29862 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260430 02:02:48.331034 29862 hybrid_clock.cc:648] HybridClock initialized: now 1777514568331018 us; error 28 us; skew 500 ppm
I20260430 02:02:48.333189 29862 webserver.cc:492] Webserver started at http://127.24.153.65:41493/ using document root <none> and password file <none>
I20260430 02:02:48.333781 29862 fs_manager.cc:362] Metadata directory not provided
I20260430 02:02:48.333837 29862 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260430 02:02:48.337476 29862 fs_manager.cc:714] Time spent opening directory manager: real 0.002s user 0.002s sys 0.000s
I20260430 02:02:48.339764 29878 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260430 02:02:48.340893 29862 fs_manager.cc:730] Time spent opening block manager: real 0.002s user 0.002s sys 0.000s
I20260430 02:02:48.341028 29862 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/data,/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/wal
uuid: "7ab3773c508c494d8e0246ad5c98fbc0"
format_stamp: "Formatted at 2026-04-30 02:02:13 on dist-test-slave-f7mg"
I20260430 02:02:48.341435 29862 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
W20260430 02:02:48.345109 29206 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39516: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
I20260430 02:02:48.352651 29862 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260430 02:02:48.353544 29862 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260430 02:02:48.353770 29862 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260430 02:02:48.354499 29862 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260430 02:02:48.355801 29885 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20260430 02:02:48.359287 29862 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20260430 02:02:48.359359 29862 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.004s user 0.001s sys 0.000s
I20260430 02:02:48.359437 29862 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20260430 02:02:48.361264 29862 ts_tablet_manager.cc:616] Registered 1 tablets
I20260430 02:02:48.361337 29862 ts_tablet_manager.cc:595] Time spent register tablets: real 0.002s user 0.000s sys 0.000s
I20260430 02:02:48.361644 29885 tablet_bootstrap.cc:492] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0: Bootstrap starting.
W20260430 02:02:48.366048 29206 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39516: Illegal state: replica 5a60f2bd808d4e6da03625fcdb9225e8 is not leader of this config: current role FOLLOWER
I20260430 02:02:48.373167 29862 rpc_server.cc:307] RPC server started. Bound to: 127.24.153.65:45451
I20260430 02:02:48.373185 29992 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.24.153.65:45451 every 8 connection(s)
I20260430 02:02:48.374366 29862 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-0/data/info.pb
I20260430 02:02:48.384045 25189 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu as pid 29862
I20260430 02:02:48.384200 25189 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu with pid 29180
I20260430 02:02:48.386590 29751 log.cc:826] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0: Log is configured to *not* fsync() on all Append() calls
I20260430 02:02:48.382863 29993 heartbeater.cc:344] Connected to a master server at 127.24.153.126:41269
I20260430 02:02:48.388358 29993 heartbeater.cc:461] Registering TS with master...
I20260430 02:02:48.389191 29993 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
I20260430 02:02:48.391520 26369 ts_manager.cc:194] Re-registered known tserver with Master: 7ab3773c508c494d8e0246ad5c98fbc0 (127.24.153.65:45451)
I20260430 02:02:48.392506 26369 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.24.153.65:53849
W20260430 02:02:48.396698 26968 connection.cc:570] client connection to 127.24.153.67:43357 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
I20260430 02:02:48.397320 25189 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
/tmp/dist-test-taskfXPN2o/build/debug/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.24.153.67:43357
--local_ip_for_outbound_sockets=127.24.153.67
--tserver_master_addrs=127.24.153.126:41269
--webserver_port=43087
--webserver_interface=127.24.153.67
--builtin_ntp_servers=127.24.153.84:44061
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20260430 02:02:48.497681 26995 meta_cache.cc:1510] marking tablet server 5a60f2bd808d4e6da03625fcdb9225e8 (127.24.153.67:43357) as failed
W20260430 02:02:48.558583 29997 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260430 02:02:48.558943 29997 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260430 02:02:48.559024 29997 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260430 02:02:48.564726 29997 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260430 02:02:48.564927 29997 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.24.153.67
I20260430 02:02:48.571239 29997 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.24.153.84:44061
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.24.153.67:43357
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.24.153.67
--webserver_port=43087
--tserver_master_addrs=127.24.153.126:41269
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.29997
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.24.153.67
--log_dir=/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8537ed5bb5e0e6ac1e481ff961e6b9d08cce7671
build type DEBUG
built by None at 30 Apr 2026 01:43:13 UTC on bdcb31816ec0
build id 11668
I20260430 02:02:48.572940 29997 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260430 02:02:48.574536 29997 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260430 02:02:48.578325 29997 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260430 02:02:48.583482 30006 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:02:48.583487 30004 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260430 02:02:48.584204 30003 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260430 02:02:48.584982 29997 server_base.cc:1061] running on GCE node
I20260430 02:02:48.585536 29997 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260430 02:02:48.586340 29997 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260430 02:02:48.587531 29997 hybrid_clock.cc:648] HybridClock initialized: now 1777514568587358 us; error 196 us; skew 500 ppm
I20260430 02:02:48.590510 29997 webserver.cc:492] Webserver started at http://127.24.153.67:43087/ using document root <none> and password file <none>
I20260430 02:02:48.591321 29997 fs_manager.cc:362] Metadata directory not provided
I20260430 02:02:48.591432 29997 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260430 02:02:48.596171 29997 fs_manager.cc:714] Time spent opening directory manager: real 0.003s user 0.004s sys 0.000s
I20260430 02:02:48.598960 30012 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20260430 02:02:48.600281 29997 fs_manager.cc:730] Time spent opening block manager: real 0.003s user 0.003s sys 0.000s
I20260430 02:02:48.600418 29997 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/data,/tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/wal
uuid: "5a60f2bd808d4e6da03625fcdb9225e8"
format_stamp: "Formatted at 2026-04-30 02:02:14 on dist-test-slave-f7mg"
I20260430 02:02:48.600970 29997 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260430 02:02:48.650251 26994 meta_cache.cc:1510] marking tablet server 5a60f2bd808d4e6da03625fcdb9225e8 (127.24.153.67:43357) as failed
I20260430 02:02:48.673418 29997 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20260430 02:02:48.674352 29997 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260430 02:02:48.674690 29997 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260430 02:02:48.674575 29885 log.cc:826] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0: Log is configured to *not* fsync() on all Append() calls
I20260430 02:02:48.675448 29997 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260430 02:02:48.677088 30020 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20260430 02:02:48.681318 29997 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20260430 02:02:48.681473 29997 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.005s user 0.001s sys 0.000s
I20260430 02:02:48.681604 29997 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20260430 02:02:48.683574 29997 ts_tablet_manager.cc:616] Registered 1 tablets
I20260430 02:02:48.683701 29997 ts_tablet_manager.cc:595] Time spent register tablets: real 0.002s user 0.002s sys 0.000s
I20260430 02:02:48.683970 30020 tablet_bootstrap.cc:492] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8: Bootstrap starting.
I20260430 02:02:48.695806 29997 rpc_server.cc:307] RPC server started. Bound to: 127.24.153.67:43357
I20260430 02:02:48.697417 29997 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0/minicluster-data/ts-2/data/info.pb
I20260430 02:02:48.699426 25189 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu as pid 29997
I20260430 02:02:48.708289 30127 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.24.153.67:43357 every 8 connection(s)
I20260430 02:02:48.725790 30128 heartbeater.cc:344] Connected to a master server at 127.24.153.126:41269
I20260430 02:02:48.726090 30128 heartbeater.cc:461] Registering TS with master...
I20260430 02:02:48.726732 30128 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
I20260430 02:02:48.729012 26371 ts_manager.cc:194] Re-registered known tserver with Master: 5a60f2bd808d4e6da03625fcdb9225e8 (127.24.153.67:43357)
I20260430 02:02:48.730149 26371 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.24.153.67:37619
I20260430 02:02:48.915879 29726 heartbeater.cc:499] Master 127.24.153.126:41269 was elected leader, sending a full tablet report...
I20260430 02:02:49.022231 30020 log.cc:826] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8: Log is configured to *not* fsync() on all Append() calls
I20260430 02:02:49.063169 30057 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260430 02:02:49.066340 29908 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260430 02:02:49.081022 29793 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260430 02:02:49.084183 29660 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260430 02:02:49.142158 29859 heartbeater.cc:499] Master 127.24.153.126:41269 was elected leader, sending a full tablet report...
I20260430 02:02:49.395804 29993 heartbeater.cc:499] Master 127.24.153.126:41269 was elected leader, sending a full tablet report...
I20260430 02:02:49.732553 30128 heartbeater.cc:499] Master 127.24.153.126:41269 was elected leader, sending a full tablet report...
I20260430 02:02:52.766609 29751 tablet_bootstrap.cc:492] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0: Bootstrap replayed 1/1 log segments. Stats: ops{read=2265 overwritten=0 applied=2262 ignored=0} inserts{seen=112900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20260430 02:02:52.767197 29751 tablet_bootstrap.cc:492] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0: Bootstrap complete.
I20260430 02:02:52.768769 29751 ts_tablet_manager.cc:1403] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0: Time spent bootstrapping tablet: real 4.656s user 4.354s sys 0.184s
I20260430 02:02:52.772257 29751 raft_consensus.cc:359] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 10 FOLLOWER]: Replica starting. Triggering 3 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:52.774480 29751 raft_consensus.cc:740] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 10 FOLLOWER]: Becoming Follower/Learner. State: Replica: fdcc2c1450744cb499d898b871318fa0, State: Initialized, Role: FOLLOWER
I20260430 02:02:52.775193 29751 consensus_queue.cc:260] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 2262, Last appended: 8.2265, Last appended by leader: 2265, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:52.776422 29751 ts_tablet_manager.cc:1434] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0: Time spent starting tablet: real 0.008s user 0.008s sys 0.002s
I20260430 02:02:53.081013 30167 raft_consensus.cc:493] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 10 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260430 02:02:53.081236 30167 raft_consensus.cc:515] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 10 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:53.082568 30167 leader_election.cc:290] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [CANDIDATE]: Term 11 pre-election: Requested pre-vote from peers 7ab3773c508c494d8e0246ad5c98fbc0 (127.24.153.65:45451), 5a60f2bd808d4e6da03625fcdb9225e8 (127.24.153.67:43357)
I20260430 02:02:53.088683 29936 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "fdcc2c1450744cb499d898b871318fa0" candidate_term: 11 candidate_status { last_received { term: 8 index: 2265 } } ignore_live_leader: false dest_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" is_pre_election: true
I20260430 02:02:53.088506 30077 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "fdcc2c1450744cb499d898b871318fa0" candidate_term: 11 candidate_status { last_received { term: 8 index: 2265 } } ignore_live_leader: false dest_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" is_pre_election: true
W20260430 02:02:53.090101 29746 leader_election.cc:343] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [CANDIDATE]: Term 11 pre-election: Tablet error from VoteRequest() call to peer 7ab3773c508c494d8e0246ad5c98fbc0 (127.24.153.65:45451): Illegal state: must be running to vote when last-logged opid is not known
W20260430 02:02:53.090292 29748 leader_election.cc:343] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [CANDIDATE]: Term 11 pre-election: Tablet error from VoteRequest() call to peer 5a60f2bd808d4e6da03625fcdb9225e8 (127.24.153.67:43357): Illegal state: must be running to vote when last-logged opid is not known
I20260430 02:02:53.090354 29746 leader_election.cc:304] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [CANDIDATE]: Term 11 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: fdcc2c1450744cb499d898b871318fa0; no voters: 5a60f2bd808d4e6da03625fcdb9225e8, 7ab3773c508c494d8e0246ad5c98fbc0
I20260430 02:02:53.090600 30167 raft_consensus.cc:2749] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 10 FOLLOWER]: Leader pre-election lost for term 11. Reason: could not achieve majority
W20260430 02:02:53.174381 26994 scanner-internal.cc:458] Time spent opening tablet: real 5.636s user 0.005s sys 0.001s
W20260430 02:02:53.317132 26995 scanner-internal.cc:458] Time spent opening tablet: real 6.028s user 0.006s sys 0.000s
I20260430 02:02:53.430294 30167 raft_consensus.cc:493] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 10 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260430 02:02:53.430433 30167 raft_consensus.cc:515] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 10 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:53.430855 30167 leader_election.cc:290] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [CANDIDATE]: Term 11 pre-election: Requested pre-vote from peers 7ab3773c508c494d8e0246ad5c98fbc0 (127.24.153.65:45451), 5a60f2bd808d4e6da03625fcdb9225e8 (127.24.153.67:43357)
I20260430 02:02:53.431372 29936 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "fdcc2c1450744cb499d898b871318fa0" candidate_term: 11 candidate_status { last_received { term: 8 index: 2265 } } ignore_live_leader: false dest_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" is_pre_election: true
I20260430 02:02:53.431459 30077 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "fdcc2c1450744cb499d898b871318fa0" candidate_term: 11 candidate_status { last_received { term: 8 index: 2265 } } ignore_live_leader: false dest_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" is_pre_election: true
W20260430 02:02:53.431800 29746 leader_election.cc:343] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [CANDIDATE]: Term 11 pre-election: Tablet error from VoteRequest() call to peer 7ab3773c508c494d8e0246ad5c98fbc0 (127.24.153.65:45451): Illegal state: must be running to vote when last-logged opid is not known
W20260430 02:02:53.431913 29748 leader_election.cc:343] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [CANDIDATE]: Term 11 pre-election: Tablet error from VoteRequest() call to peer 5a60f2bd808d4e6da03625fcdb9225e8 (127.24.153.67:43357): Illegal state: must be running to vote when last-logged opid is not known
I20260430 02:02:53.431998 29748 leader_election.cc:304] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [CANDIDATE]: Term 11 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: fdcc2c1450744cb499d898b871318fa0; no voters: 5a60f2bd808d4e6da03625fcdb9225e8, 7ab3773c508c494d8e0246ad5c98fbc0
I20260430 02:02:53.432248 30167 raft_consensus.cc:2749] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 10 FOLLOWER]: Leader pre-election lost for term 11. Reason: could not achieve majority
W20260430 02:02:53.554019 26996 scanner-internal.cc:458] Time spent opening tablet: real 6.054s user 0.007s sys 0.000s
I20260430 02:02:53.677969 29885 tablet_bootstrap.cc:492] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0: Bootstrap replaying log segment 1/1 (3.66M/3.90M this segment, stats: ops{read=2123 overwritten=0 applied=2121 ignored=0} inserts{seen=105850 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0)
I20260430 02:02:53.853745 30167 raft_consensus.cc:493] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 10 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260430 02:02:53.853883 30167 raft_consensus.cc:515] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 10 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:53.854480 30167 leader_election.cc:290] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [CANDIDATE]: Term 11 pre-election: Requested pre-vote from peers 7ab3773c508c494d8e0246ad5c98fbc0 (127.24.153.65:45451), 5a60f2bd808d4e6da03625fcdb9225e8 (127.24.153.67:43357)
I20260430 02:02:53.854946 29936 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "fdcc2c1450744cb499d898b871318fa0" candidate_term: 11 candidate_status { last_received { term: 8 index: 2265 } } ignore_live_leader: false dest_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" is_pre_election: true
I20260430 02:02:53.855011 30077 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "fdcc2c1450744cb499d898b871318fa0" candidate_term: 11 candidate_status { last_received { term: 8 index: 2265 } } ignore_live_leader: false dest_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" is_pre_election: true
W20260430 02:02:53.855372 29746 leader_election.cc:343] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [CANDIDATE]: Term 11 pre-election: Tablet error from VoteRequest() call to peer 7ab3773c508c494d8e0246ad5c98fbc0 (127.24.153.65:45451): Illegal state: must be running to vote when last-logged opid is not known
W20260430 02:02:53.855690 29748 leader_election.cc:343] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [CANDIDATE]: Term 11 pre-election: Tablet error from VoteRequest() call to peer 5a60f2bd808d4e6da03625fcdb9225e8 (127.24.153.67:43357): Illegal state: must be running to vote when last-logged opid is not known
I20260430 02:02:53.855760 29748 leader_election.cc:304] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [CANDIDATE]: Term 11 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: fdcc2c1450744cb499d898b871318fa0; no voters: 5a60f2bd808d4e6da03625fcdb9225e8, 7ab3773c508c494d8e0246ad5c98fbc0
I20260430 02:02:53.856178 30167 raft_consensus.cc:2749] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 10 FOLLOWER]: Leader pre-election lost for term 11. Reason: could not achieve majority
I20260430 02:02:54.002451 29885 tablet_bootstrap.cc:492] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0: Bootstrap replayed 1/1 log segments. Stats: ops{read=2265 overwritten=0 applied=2262 ignored=0} inserts{seen=112900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20260430 02:02:54.003125 29885 tablet_bootstrap.cc:492] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0: Bootstrap complete.
I20260430 02:02:54.005111 29885 ts_tablet_manager.cc:1403] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0: Time spent bootstrapping tablet: real 5.644s user 5.323s sys 0.200s
I20260430 02:02:54.007877 29885 raft_consensus.cc:359] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 10 FOLLOWER]: Replica starting. Triggering 3 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:54.010178 29885 raft_consensus.cc:740] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 10 FOLLOWER]: Becoming Follower/Learner. State: Replica: 7ab3773c508c494d8e0246ad5c98fbc0, State: Initialized, Role: FOLLOWER
I20260430 02:02:54.010826 29885 consensus_queue.cc:260] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 2262, Last appended: 8.2265, Last appended by leader: 2265, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:54.011868 29885 ts_tablet_manager.cc:1434] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0: Time spent starting tablet: real 0.007s user 0.006s sys 0.000s
I20260430 02:02:54.031225 30020 tablet_bootstrap.cc:492] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8: Bootstrap replaying log segment 1/1 (3.80M/3.90M this segment, stats: ops{read=2207 overwritten=2 applied=2205 ignored=0} inserts{seen=110050 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0)
I20260430 02:02:54.161278 30167 raft_consensus.cc:493] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 10 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260430 02:02:54.161407 30167 raft_consensus.cc:515] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 10 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:54.161837 30167 leader_election.cc:290] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [CANDIDATE]: Term 11 pre-election: Requested pre-vote from peers 7ab3773c508c494d8e0246ad5c98fbc0 (127.24.153.65:45451), 5a60f2bd808d4e6da03625fcdb9225e8 (127.24.153.67:43357)
I20260430 02:02:54.162423 29936 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "fdcc2c1450744cb499d898b871318fa0" candidate_term: 11 candidate_status { last_received { term: 8 index: 2265 } } ignore_live_leader: false dest_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" is_pre_election: true
I20260430 02:02:54.162691 29936 raft_consensus.cc:2468] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 10 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate fdcc2c1450744cb499d898b871318fa0 in term 10.
I20260430 02:02:54.162676 30077 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "fdcc2c1450744cb499d898b871318fa0" candidate_term: 11 candidate_status { last_received { term: 8 index: 2265 } } ignore_live_leader: false dest_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" is_pre_election: true
I20260430 02:02:54.163053 29746 leader_election.cc:304] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [CANDIDATE]: Term 11 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 7ab3773c508c494d8e0246ad5c98fbc0, fdcc2c1450744cb499d898b871318fa0; no voters:
W20260430 02:02:54.163148 29748 leader_election.cc:343] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [CANDIDATE]: Term 11 pre-election: Tablet error from VoteRequest() call to peer 5a60f2bd808d4e6da03625fcdb9225e8 (127.24.153.67:43357): Illegal state: must be running to vote when last-logged opid is not known
I20260430 02:02:54.163444 30167 raft_consensus.cc:2804] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 10 FOLLOWER]: Leader pre-election won for term 11
I20260430 02:02:54.163504 30167 raft_consensus.cc:493] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 10 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260430 02:02:54.163546 30167 raft_consensus.cc:3060] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 10 FOLLOWER]: Advancing to term 11
I20260430 02:02:54.166360 30167 raft_consensus.cc:515] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 11 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:54.167516 29936 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "fdcc2c1450744cb499d898b871318fa0" candidate_term: 11 candidate_status { last_received { term: 8 index: 2265 } } ignore_live_leader: false dest_uuid: "7ab3773c508c494d8e0246ad5c98fbc0"
I20260430 02:02:54.167675 29936 raft_consensus.cc:3060] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 10 FOLLOWER]: Advancing to term 11
I20260430 02:02:54.170049 29936 raft_consensus.cc:2468] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 11 FOLLOWER]: Leader election vote request: Granting yes vote for candidate fdcc2c1450744cb499d898b871318fa0 in term 11.
I20260430 02:02:54.170450 29746 leader_election.cc:304] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [CANDIDATE]: Term 11 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 7ab3773c508c494d8e0246ad5c98fbc0, fdcc2c1450744cb499d898b871318fa0; no voters:
I20260430 02:02:54.171401 30077 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "5f11b12f19254ea9a609af28f72cdaa6" candidate_uuid: "fdcc2c1450744cb499d898b871318fa0" candidate_term: 11 candidate_status { last_received { term: 8 index: 2265 } } ignore_live_leader: false dest_uuid: "5a60f2bd808d4e6da03625fcdb9225e8"
W20260430 02:02:54.171979 29748 leader_election.cc:343] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [CANDIDATE]: Term 11 election: Tablet error from VoteRequest() call to peer 5a60f2bd808d4e6da03625fcdb9225e8 (127.24.153.67:43357): Illegal state: must be running to vote when last-logged opid is not known
I20260430 02:02:54.172139 30167 leader_election.cc:290] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [CANDIDATE]: Term 11 election: Requested vote from peers 7ab3773c508c494d8e0246ad5c98fbc0 (127.24.153.65:45451), 5a60f2bd808d4e6da03625fcdb9225e8 (127.24.153.67:43357)
I20260430 02:02:54.172266 30020 tablet_bootstrap.cc:492] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8: Bootstrap replayed 1/1 log segments. Stats: ops{read=2264 overwritten=2 applied=2262 ignored=0} inserts{seen=112900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20260430 02:02:54.172405 30167 raft_consensus.cc:2804] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 11 FOLLOWER]: Leader election won for term 11
I20260430 02:02:54.172497 30167 raft_consensus.cc:697] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [term 11 LEADER]: Becoming Leader. State: Replica: fdcc2c1450744cb499d898b871318fa0, State: Running, Role: LEADER
I20260430 02:02:54.172935 30020 tablet_bootstrap.cc:492] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8: Bootstrap complete.
I20260430 02:02:54.172871 30167 consensus_queue.cc:237] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 2262, Committed index: 2262, Last appended: 8.2265, Last appended by leader: 2265, Current term: 11, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:54.174880 30020 ts_tablet_manager.cc:1403] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8: Time spent bootstrapping tablet: real 5.491s user 5.287s sys 0.104s
I20260430 02:02:54.176342 26369 catalog_manager.cc:5671] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 reported cstate change: term changed from 8 to 11. New cstate: current_term: 11 leader_uuid: "fdcc2c1450744cb499d898b871318fa0" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } health_report { overall_health: UNKNOWN } } }
I20260430 02:02:54.178381 30020 raft_consensus.cc:359] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 10 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:54.178829 30020 raft_consensus.cc:740] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 10 FOLLOWER]: Becoming Follower/Learner. State: Replica: 5a60f2bd808d4e6da03625fcdb9225e8, State: Initialized, Role: FOLLOWER
I20260430 02:02:54.179313 30020 consensus_queue.cc:260] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 2262, Last appended: 8.2262, Last appended by leader: 2262, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "fdcc2c1450744cb499d898b871318fa0" member_type: VOTER last_known_addr { host: "127.24.153.68" port: 41611 } } peers { permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 } } peers { permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 } }
I20260430 02:02:54.180410 30020 ts_tablet_manager.cc:1434] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8: Time spent starting tablet: real 0.005s user 0.004s sys 0.000s
I20260430 02:02:54.283762 29936 raft_consensus.cc:1275] T 5f11b12f19254ea9a609af28f72cdaa6 P 7ab3773c508c494d8e0246ad5c98fbc0 [term 11 FOLLOWER]: Refusing update from remote peer fdcc2c1450744cb499d898b871318fa0: Log matching property violated. Preceding OpId in replica: term: 8 index: 2265. Preceding OpId from leader: term: 11 index: 2266. (index mismatch)
I20260430 02:02:54.284682 30176 consensus_queue.cc:1048] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [LEADER]: Connected to new peer: Peer: permanent_uuid: "7ab3773c508c494d8e0246ad5c98fbc0" member_type: VOTER last_known_addr { host: "127.24.153.65" port: 45451 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 2266, Last known committed idx: 2262, Time since last communication: 0.000s
I20260430 02:02:54.291435 30077 raft_consensus.cc:3060] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 10 FOLLOWER]: Advancing to term 11
I20260430 02:02:54.291587 30184 rpcz_store.cc:275] Call kudu.tserver.TabletServerService.Write from 127.0.0.1:35470 (ReqId={client: 518d500ccedb41c29c0b71eda1743cbf, seq_no=2258, attempt_no=93}) took 1301 ms. Trace:
I20260430 02:02:54.291587 30185 rpcz_store.cc:275] Call kudu.tserver.TabletServerService.Write from 127.0.0.1:35470 (ReqId={client: 518d500ccedb41c29c0b71eda1743cbf, seq_no=2259, attempt_no=93}) took 1315 ms. Trace:
I20260430 02:02:54.291743 30185 rpcz_store.cc:276] 0430 02:02:52.976332 (+ 0us) service_pool.cc:168] Inserting onto call queue
0430 02:02:52.976422 (+ 90us) service_pool.cc:225] Handling call
0430 02:02:54.291568 (+1315146us) inbound_call.cc:177] Queueing success response
Metrics: {}
I20260430 02:02:54.291731 30184 rpcz_store.cc:276] 0430 02:02:52.990050 (+ 0us) service_pool.cc:168] Inserting onto call queue
0430 02:02:52.990131 (+ 81us) service_pool.cc:225] Handling call
0430 02:02:54.291568 (+1301437us) inbound_call.cc:177] Queueing success response
Metrics: {}
I20260430 02:02:54.293876 30185 rpcz_store.cc:275] Call kudu.tserver.TabletServerService.Write from 127.0.0.1:35470 (ReqId={client: 518d500ccedb41c29c0b71eda1743cbf, seq_no=2260, attempt_no=93}) took 1324 ms. Trace:
I20260430 02:02:54.293993 30185 rpcz_store.cc:276] 0430 02:02:52.968951 (+ 0us) service_pool.cc:168] Inserting onto call queue
0430 02:02:52.969044 (+ 93us) service_pool.cc:225] Handling call
0430 02:02:54.293857 (+1324813us) inbound_call.cc:177] Queueing success response
Metrics: {}
I20260430 02:02:54.297966 30077 raft_consensus.cc:1275] T 5f11b12f19254ea9a609af28f72cdaa6 P 5a60f2bd808d4e6da03625fcdb9225e8 [term 11 FOLLOWER]: Refusing update from remote peer fdcc2c1450744cb499d898b871318fa0: Log matching property violated. Preceding OpId in replica: term: 8 index: 2262. Preceding OpId from leader: term: 11 index: 2266. (index mismatch)
I20260430 02:02:54.298751 30167 consensus_queue.cc:1048] T 5f11b12f19254ea9a609af28f72cdaa6 P fdcc2c1450744cb499d898b871318fa0 [LEADER]: Connected to new peer: Peer: permanent_uuid: "5a60f2bd808d4e6da03625fcdb9225e8" member_type: VOTER last_known_addr { host: "127.24.153.67" port: 43357 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 2266, Last known committed idx: 2262, Time since last communication: 0.000s
I20260430 02:02:54.726073 30057 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260430 02:02:54.737366 29908 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260430 02:02:54.747059 29660 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260430 02:02:54.776305 29793 tablet_service.cc:1467] Tablet server has 1 leaders and 3 scanners
I20260430 02:02:55.430433 26371 ts_manager.cc:284] Unset tserver state for a2771da784d84201a3de0860ab987f1f from MAINTENANCE_MODE
I20260430 02:02:55.432118 29993 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
I20260430 02:02:55.432631 30128 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
I20260430 02:02:55.433085 29859 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
I20260430 02:02:55.567899 26369 ts_manager.cc:284] Unset tserver state for 5a60f2bd808d4e6da03625fcdb9225e8 from MAINTENANCE_MODE
I20260430 02:02:55.570498 26365 ts_manager.cc:284] Unset tserver state for 7ab3773c508c494d8e0246ad5c98fbc0 from MAINTENANCE_MODE
I20260430 02:02:55.580273 26365 ts_manager.cc:284] Unset tserver state for fdcc2c1450744cb499d898b871318fa0 from MAINTENANCE_MODE
/home/jenkins-slave/workspace/build_and_test_flaky/src/kudu/integration-tests/maintenance_mode-itest.cc:751: Failure
Value of: s.ok()
Actual: true
Expected: false
/home/jenkins-slave/workspace/build_and_test_flaky/src/kudu/util/test_util.cc:402: Failure
Failed
Timed out waiting for assertion to pass.
I20260430 02:02:55.930553 29726 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
I20260430 02:02:55.938882 25189 external_mini_cluster-itest-base.cc:80] Found fatal failure
I20260430 02:02:55.939024 25189 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 0 with UUID 7ab3773c508c494d8e0246ad5c98fbc0 and pid 29862
************************ BEGIN STACKS **************************
I20260430 02:02:56.438223 29993 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
I20260430 02:02:56.438385 30128 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
I20260430 02:02:56.441057 29859 heartbeater.cc:507] Master 127.24.153.126:41269 requested a full tablet report, sending...
[New LWP 29865]
[New LWP 29866]
[New LWP 29867]
[New LWP 29868]
[New LWP 29874]
[New LWP 29875]
[New LWP 29876]
[New LWP 29879]
[New LWP 29880]
[New LWP 29881]
[New LWP 29882]
[New LWP 29883]
[New LWP 29884]
[New LWP 29886]
[New LWP 29887]
[New LWP 29888]
[New LWP 29889]
[New LWP 29890]
[New LWP 29891]
[New LWP 29892]
[New LWP 29893]
[New LWP 29894]
[New LWP 29895]
[New LWP 29896]
[New LWP 29897]
[New LWP 29898]
[New LWP 29899]
[New LWP 29900]
[New LWP 29901]
[New LWP 29902]
[New LWP 29903]
[New LWP 29904]
[New LWP 29905]
[New LWP 29906]
[New LWP 29907]
[New LWP 29908]
[New LWP 29909]
[New LWP 29910]
[New LWP 29911]
[New LWP 29912]
[New LWP 29913]
[New LWP 29914]
[New LWP 29915]
[New LWP 29916]
[New LWP 29917]
[New LWP 29918]
[New LWP 29919]
[New LWP 29920]
[New LWP 29921]
[New LWP 29922]
[New LWP 29923]
[New LWP 29924]
[New LWP 29925]
[New LWP 29926]
[New LWP 29927]
[New LWP 29928]
[New LWP 29929]
[New LWP 29930]
[New LWP 29931]
[New LWP 29932]
[New LWP 29933]
[New LWP 29934]
[New LWP 29935]
[New LWP 29936]
[New LWP 29937]
[New LWP 29938]
[New LWP 29939]
[New LWP 29940]
[New LWP 29941]
[New LWP 29942]
[New LWP 29943]
[New LWP 29944]
[New LWP 29945]
[New LWP 29946]
[New LWP 29947]
[New LWP 29948]
[New LWP 29949]
[New LWP 29950]
[New LWP 29951]
[New LWP 29952]
[New LWP 29953]
[New LWP 29954]
[New LWP 29955]
[New LWP 29956]
[New LWP 29957]
[New LWP 29958]
[New LWP 29959]
[New LWP 29960]
[New LWP 29961]
[New LWP 29962]
[New LWP 29963]
[New LWP 29964]
[New LWP 29965]
[New LWP 29966]
[New LWP 29967]
[New LWP 29968]
[New LWP 29969]
[New LWP 29970]
[New LWP 29971]
[New LWP 29972]
[New LWP 29973]
[New LWP 29974]
[New LWP 29975]
[New LWP 29976]
[New LWP 29977]
[New LWP 29978]
[New LWP 29979]
[New LWP 29980]
[New LWP 29981]
[New LWP 29982]
[New LWP 29983]
[New LWP 29984]
[New LWP 29985]
[New LWP 29986]
[New LWP 29987]
[New LWP 29988]
[New LWP 29989]
[New LWP 29990]
[New LWP 29991]
[New LWP 29992]
[New LWP 29993]
[New LWP 29994]
[New LWP 30183]
0x00007f3bceb68d50 in ?? ()
Id Target Id Frame
* 1 LWP 29862 "kudu" 0x00007f3bceb68d50 in ?? ()
2 LWP 29865 "kudu" 0x00007f3bceb64fb9 in ?? ()
3 LWP 29866 "kudu" 0x00007f3bceb64fb9 in ?? ()
4 LWP 29867 "kudu" 0x00007f3bceb64fb9 in ?? ()
5 LWP 29868 "kernel-watcher-" 0x00007f3bceb64fb9 in ?? ()
6 LWP 29874 "ntp client-2987" 0x00007f3bceb689e2 in ?? ()
7 LWP 29875 "file cache-evic" 0x00007f3bceb64fb9 in ?? ()
8 LWP 29876 "sq_acceptor" 0x00007f3bcbebfbb9 in ?? ()
9 LWP 29879 "rpc reactor-298" 0x00007f3bcbecc947 in ?? ()
10 LWP 29880 "rpc reactor-298" 0x00007f3bcbecc947 in ?? ()
11 LWP 29881 "rpc reactor-298" 0x00007f3bcbecc947 in ?? ()
12 LWP 29882 "rpc reactor-298" 0x00007f3bcbecc947 in ?? ()
13 LWP 29883 "MaintenanceMgr " 0x00007f3bceb64ad3 in ?? ()
14 LWP 29884 "txn-status-mana" 0x00007f3bceb64fb9 in ?? ()
15 LWP 29886 "collect_and_rem" 0x00007f3bceb64fb9 in ?? ()
16 LWP 29887 "tc-session-exp-" 0x00007f3bceb64fb9 in ?? ()
17 LWP 29888 "rpc worker-2988" 0x00007f3bceb64ad3 in ?? ()
18 LWP 29889 "rpc worker-2988" 0x00007f3bceb64ad3 in ?? ()
19 LWP 29890 "rpc worker-2989" 0x00007f3bceb64ad3 in ?? ()
20 LWP 29891 "rpc worker-2989" 0x00007f3bceb64ad3 in ?? ()
21 LWP 29892 "rpc worker-2989" 0x00007f3bceb64ad3 in ?? ()
22 LWP 29893 "rpc worker-2989" 0x00007f3bceb64ad3 in ?? ()
23 LWP 29894 "rpc worker-2989" 0x00007f3bceb64ad3 in ?? ()
24 LWP 29895 "rpc worker-2989" 0x00007f3bceb64ad3 in ?? ()
25 LWP 29896 "rpc worker-2989" 0x00007f3bceb64ad3 in ?? ()
26 LWP 29897 "rpc worker-2989" 0x00007f3bceb64ad3 in ?? ()
27 LWP 29898 "rpc worker-2989" 0x00007f3bceb64ad3 in ?? ()
28 LWP 29899 "rpc worker-2989" 0x00007f3bceb64ad3 in ?? ()
29 LWP 29900 "rpc worker-2990" 0x00007f3bceb64ad3 in ?? ()
30 LWP 29901 "rpc worker-2990" 0x00007f3bceb64ad3 in ?? ()
31 LWP 29902 "rpc worker-2990" 0x00007f3bceb64ad3 in ?? ()
32 LWP 29903 "rpc worker-2990" 0x00007f3bceb64ad3 in ?? ()
33 LWP 29904 "rpc worker-2990" 0x00007f3bceb64ad3 in ?? ()
34 LWP 29905 "rpc worker-2990" 0x00007f3bceb64ad3 in ?? ()
35 LWP 29906 "rpc worker-2990" 0x00007f3bceb64ad3 in ?? ()
36 LWP 29907 "rpc worker-2990" 0x00007f3bceb64ad3 in ?? ()
37 LWP 29908 "rpc worker-2990" 0x00007f3bceb64ad3 in ?? ()
38 LWP 29909 "rpc worker-2990" 0x00007f3bceb64ad3 in ?? ()
39 LWP 29910 "rpc worker-2991" 0x00007f3bceb64ad3 in ?? ()
40 LWP 29911 "rpc worker-2991" 0x00007f3bceb64ad3 in ?? ()
41 LWP 29912 "rpc worker-2991" 0x00007f3bceb64ad3 in ?? ()
42 LWP 29913 "rpc worker-2991" 0x00007f3bceb64ad3 in ?? ()
43 LWP 29914 "rpc worker-2991" 0x00007f3bceb64ad3 in ?? ()
44 LWP 29915 "rpc worker-2991" 0x00007f3bceb64ad3 in ?? ()
45 LWP 29916 "rpc worker-2991" 0x00007f3bceb64ad3 in ?? ()
46 LWP 29917 "rpc worker-2991" 0x00007f3bceb64ad3 in ?? ()
47 LWP 29918 "rpc worker-2991" 0x00007f3bceb64ad3 in ?? ()
48 LWP 29919 "rpc worker-2991" 0x00007f3bceb64ad3 in ?? ()
49 LWP 29920 "rpc worker-2992" 0x00007f3bceb64ad3 in ?? ()
50 LWP 29921 "rpc worker-2992" 0x00007f3bceb64ad3 in ?? ()
51 LWP 29922 "rpc worker-2992" 0x00007f3bceb64ad3 in ?? ()
52 LWP 29923 "rpc worker-2992" 0x00007f3bceb64ad3 in ?? ()
53 LWP 29924 "rpc worker-2992" 0x00007f3bceb64ad3 in ?? ()
54 LWP 29925 "rpc worker-2992" 0x00007f3bceb64ad3 in ?? ()
55 LWP 29926 "rpc worker-2992" 0x00007f3bceb64ad3 in ?? ()
56 LWP 29927 "rpc worker-2992" 0x00007f3bceb64ad3 in ?? ()
57 LWP 29928 "rpc worker-2992" 0x00007f3bceb64ad3 in ?? ()
58 LWP 29929 "rpc worker-2992" 0x00007f3bceb64ad3 in ?? ()
59 LWP 29930 "rpc worker-2993" 0x00007f3bceb64ad3 in ?? ()
60 LWP 29931 "rpc worker-2993" 0x00007f3bceb64ad3 in ?? ()
61 LWP 29932 "rpc worker-2993" 0x00007f3bceb64ad3 in ?? ()
62 LWP 29933 "rpc worker-2993" 0x00007f3bceb64ad3 in ?? ()
63 LWP 29934 "rpc worker-2993" 0x00007f3bceb64ad3 in ?? ()
64 LWP 29935 "rpc worker-2993" 0x00007f3bceb64ad3 in ?? ()
65 LWP 29936 "rpc worker-2993" 0x00007f3bceb64ad3 in ?? ()
66 LWP 29937 "rpc worker-2993" 0x00007f3bceb64ad3 in ?? ()
67 LWP 29938 "rpc worker-2993" 0x00007f3bceb64ad3 in ?? ()
68 LWP 29939 "rpc worker-2993" 0x00007f3bceb64ad3 in ?? ()
69 LWP 29940 "rpc worker-2994" 0x00007f3bceb64ad3 in ?? ()
70 LWP 29941 "rpc worker-2994" 0x00007f3bceb64ad3 in ?? ()
71 LWP 29942 "rpc worker-2994" 0x00007f3bceb64ad3 in ?? ()
72 LWP 29943 "rpc worker-2994" 0x00007f3bceb64ad3 in ?? ()
73 LWP 29944 "rpc worker-2994" 0x00007f3bceb64ad3 in ?? ()
74 LWP 29945 "rpc worker-2994" 0x00007f3bceb64ad3 in ?? ()
75 LWP 29946 "rpc worker-2994" 0x00007f3bceb64ad3 in ?? ()
76 LWP 29947 "rpc worker-2994" 0x00007f3bceb64ad3 in ?? ()
77 LWP 29948 "rpc worker-2994" 0x00007f3bceb64ad3 in ?? ()
78 LWP 29949 "rpc worker-2994" 0x00007f3bceb64ad3 in ?? ()
79 LWP 29950 "rpc worker-2995" 0x00007f3bceb64ad3 in ?? ()
80 LWP 29951 "rpc worker-2995" 0x00007f3bceb64ad3 in ?? ()
81 LWP 29952 "rpc worker-2995" 0x00007f3bceb64ad3 in ?? ()
82 LWP 29953 "rpc worker-2995" 0x00007f3bceb64ad3 in ?? ()
83 LWP 29954 "rpc worker-2995" 0x00007f3bceb64ad3 in ?? ()
84 LWP 29955 "rpc worker-2995" 0x00007f3bceb64ad3 in ?? ()
85 LWP 29956 "rpc worker-2995" 0x00007f3bceb64ad3 in ?? ()
86 LWP 29957 "rpc worker-2995" 0x00007f3bceb64ad3 in ?? ()
87 LWP 29958 "rpc worker-2995" 0x00007f3bceb64ad3 in ?? ()
88 LWP 29959 "rpc worker-2995" 0x00007f3bceb64ad3 in ?? ()
89 LWP 29960 "rpc worker-2996" 0x00007f3bceb64ad3 in ?? ()
90 LWP 29961 "rpc worker-2996" 0x00007f3bceb64ad3 in ?? ()
91 LWP 29962 "rpc worker-2996" 0x00007f3bceb64ad3 in ?? ()
92 LWP 29963 "rpc worker-2996" 0x00007f3bceb64ad3 in ?? ()
93 LWP 29964 "rpc worker-2996" 0x00007f3bceb64ad3 in ?? ()
94 LWP 29965 "rpc worker-2996" 0x00007f3bceb64ad3 in ?? ()
95 LWP 29966 "rpc worker-2996" 0x00007f3bceb64ad3 in ?? ()
96 LWP 29967 "rpc worker-2996" 0x00007f3bceb64ad3 in ?? ()
97 LWP 29968 "rpc worker-2996" 0x00007f3bceb64ad3 in ?? ()
98 LWP 29969 "rpc worker-2996" 0x00007f3bceb64ad3 in ?? ()
99 LWP 29970 "rpc worker-2997" 0x00007f3bceb64ad3 in ?? ()
100 LWP 29971 "rpc worker-2997" 0x00007f3bceb64ad3 in ?? ()
101 LWP 29972 "rpc worker-2997" 0x00007f3bceb64ad3 in ?? ()
102 LWP 29973 "rpc worker-2997" 0x00007f3bceb64ad3 in ?? ()
103 LWP 29974 "rpc worker-2997" 0x00007f3bceb64ad3 in ?? ()
104 LWP 29975 "rpc worker-2997" 0x00007f3bceb64ad3 in ?? ()
105 LWP 29976 "rpc worker-2997" 0x00007f3bceb64ad3 in ?? ()
106 LWP 29977 "rpc worker-2997" 0x00007f3bceb64ad3 in ?? ()
107 LWP 29978 "rpc worker-2997" 0x00007f3bceb64ad3 in ?? ()
108 LWP 29979 "rpc worker-2997" 0x00007f3bceb64ad3 in ?? ()
109 LWP 29980 "rpc worker-2998" 0x00007f3bceb64ad3 in ?? ()
110 LWP 29981 "rpc worker-2998" 0x00007f3bceb64ad3 in ?? ()
111 LWP 29982 "rpc worker-2998" 0x00007f3bceb64ad3 in ?? ()
112 LWP 29983 "rpc worker-2998" 0x00007f3bceb64ad3 in ?? ()
113 LWP 29984 "rpc worker-2998" 0x00007f3bceb64ad3 in ?? ()
114 LWP 29985 "rpc worker-2998" 0x00007f3bceb64ad3 in ?? ()
115 LWP 29986 "rpc worker-2998" 0x00007f3bceb64ad3 in ?? ()
116 LWP 29987 "rpc worker-2998" 0x00007f3bceb64ad3 in ?? ()
117 LWP 29988 "diag-logger-299" 0x00007f3bceb64fb9 in ?? ()
118 LWP 29989 "result-tracker-" 0x00007f3bceb64fb9 in ?? ()
119 LWP 29990 "excess-log-dele" 0x00007f3bceb64fb9 in ?? ()
120 LWP 29991 "tcmalloc-memory" 0x00007f3bceb64fb9 in ?? ()
121 LWP 29992 "acceptor-29992" 0x00007f3bcbecdfc7 in ?? ()
122 LWP 29993 "heartbeat-29993" 0x00007f3bceb64fb9 in ?? ()
123 LWP 29994 "maintenance_sch" 0x00007f3bceb64fb9 in ?? ()
124 LWP 30183 "wal-append [wor" 0x00007f3bceb64fb9 in ?? ()
Thread 124 (LWP 30183):
#0 0x00007f3bceb64fb9 in ?? ()
#1 0x00007f3b840040f0 in ?? ()
#2 0x000000000000013a in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000564a0f54cb30 in ?? ()
#5 0x00007f3b84004110 in ?? ()
#6 0x0000000000000274 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 123 (LWP 29994):
#0 0x00007f3bceb64fb9 in ?? ()
#1 0x00007f3b84805330 in ?? ()
#2 0x0000000000000020 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000564a0f2d3e60 in ?? ()
#5 0x00007f3b84805350 in ?? ()
#6 0x0000000000000040 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 122 (LWP 29993):
#0 0x00007f3bceb64fb9 in ?? ()
#1 0x00007f3b85006310 in ?? ()
#2 0x000000000000000b in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000564a0f225640 in ?? ()
#5 0x00007f3b85006330 in ?? ()
#6 0x0000000000000016 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 121 (LWP 29992):
#0 0x00007f3bcbecdfc7 in ?? ()
#1 0x00007f3b85807128 in ?? ()
#2 0x00007f3b8580712c in ?? ()
#3 0x00007f3b85807130 in ?? ()
#4 0x0000000000000000 in ?? ()
Thread 120 (LWP 29991):
#0 0x00007f3bceb64fb9 in ?? ()
#1 0x00007f3b860084f0 in ?? ()
#2 0x0000000000000001 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x00007fff4cb1ba80 in ?? ()
#5 0x00007f3b86008510 in ?? ()
#6 0x0000000000000002 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 119 (LWP 29990):
#0 0x00007f3bceb64fb9 in ?? ()
#1 0x00007f3b86809480 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 118 (LWP 29989):
#0 0x00007f3bceb64fb9 in ?? ()
#1 0x00007f3b8700a4e0 in ?? ()
#2 0x0000000000000008 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000564a0f1de830 in ?? ()
#5 0x00007f3b8700a500 in ?? ()
#6 0x0000000000000010 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 117 (LWP 29988):
#0 0x00007f3bceb64fb9 in ?? ()
#1 0x00007f3b8780b450 in ?? ()
#2 0x0000000000000008 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000564a0f4e6be0 in ?? ()
#5 0x00007f3b8780b470 in ?? ()
#6 0x0000000000000010 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 116 (LWP 29987):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 115 (LWP 29986):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 114 (LWP 29985):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 113 (LWP 29984):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 112 (LWP 29983):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 111 (LWP 29982):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 110 (LWP 29981):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 109 (LWP 29980):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 108 (LWP 29979):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 107 (LWP 29978):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 106 (LWP 29977):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 105 (LWP 29976):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 104 (LWP 29975):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 103 (LWP 29974):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 102 (LWP 29973):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 101 (LWP 29972):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 100 (LWP 29971):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 99 (LWP 29970):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 98 (LWP 29969):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x0000564a0f4fa1fc in ?? ()
#4 0x00007f3b9101e430 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f3b9101e450 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x0000564a0f4a3570 in ?? ()
#9 0x00007f3bceb64770 in ?? ()
#10 0x00007f3b9101e450 in ?? ()
#11 0x0000000100000000 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 97 (LWP 29968):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000007 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x0000564a0f4fa3ac in ?? ()
#4 0x00007f3b9181f430 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f3b9181f450 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x0000564a0f4a3450 in ?? ()
#9 0x00007f3bceb64770 in ?? ()
#10 0x00007f3b9181f450 in ?? ()
#11 0x0000000100000000 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 96 (LWP 29967):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 95 (LWP 29966):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 94 (LWP 29965):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 93 (LWP 29964):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 92 (LWP 29963):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 91 (LWP 29962):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 90 (LWP 29961):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 89 (LWP 29960):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 88 (LWP 29959):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 87 (LWP 29958):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 86 (LWP 29957):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 85 (LWP 29956):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 84 (LWP 29955):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 83 (LWP 29954):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 82 (LWP 29953):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 81 (LWP 29952):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 80 (LWP 29951):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 79 (LWP 29950):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 78 (LWP 29949):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 77 (LWP 29948):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 76 (LWP 29947):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 75 (LWP 29946):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 74 (LWP 29945):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 73 (LWP 29944):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 72 (LWP 29943):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 71 (LWP 29942):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 70 (LWP 29941):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 69 (LWP 29940):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 68 (LWP 29939):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 67 (LWP 29938):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 66 (LWP 29937):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 65 (LWP 29936):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x000000000000008f in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x0000564a0f4e8c1c in ?? ()
#4 0x00007f3ba183f430 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f3ba183f450 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x0000564a0f49cfd0 in ?? ()
#9 0x00007f3bceb64770 in ?? ()
#10 0x00007f3ba183f450 in ?? ()
#11 0x0000000100000000 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 64 (LWP 29935):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 63 (LWP 29934):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000091 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x0000564a0f4e804c in ?? ()
#4 0x00007f3ba2841430 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f3ba2841450 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x0000564a0f49cd90 in ?? ()
#9 0x00007f3bceb64770 in ?? ()
#10 0x00007f3ba2841450 in ?? ()
#11 0x00007f3ba2841470 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 62 (LWP 29933):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 61 (LWP 29932):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 60 (LWP 29931):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 59 (LWP 29930):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 58 (LWP 29929):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 57 (LWP 29928):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 56 (LWP 29927):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 55 (LWP 29926):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 54 (LWP 29925):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 53 (LWP 29924):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 52 (LWP 29923):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 51 (LWP 29922):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 50 (LWP 29921):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 49 (LWP 29920):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 48 (LWP 29919):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 47 (LWP 29918):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 46 (LWP 29917):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 45 (LWP 29916):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 44 (LWP 29915):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 43 (LWP 29914):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 42 (LWP 29913):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 41 (LWP 29912):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 40 (LWP 29911):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 39 (LWP 29910):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 38 (LWP 29909):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 37 (LWP 29908):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000002 in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x0000564a0f4b2318 in ?? ()
#4 0x00007f3baf85b430 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f3baf85b450 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 36 (LWP 29907):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 35 (LWP 29906):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 34 (LWP 29905):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 33 (LWP 29904):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 32 (LWP 29903):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x000000000000001a in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x0000564a0f4b2a68 in ?? ()
#4 0x00007f3bb2060430 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f3bb2060450 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 31 (LWP 29902):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000040 in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x0000564a0f4b2c18 in ?? ()
#4 0x00007f3bb2861430 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f3bb2861450 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 30 (LWP 29901):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x0000564a0f4b2dcc in ?? ()
#4 0x00007f3bb3062430 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f3bb3062450 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x0000564a0f48c7f0 in ?? ()
#9 0x00007f3bceb64770 in ?? ()
#10 0x00007f3bb3062450 in ?? ()
#11 0x0000000100000000 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 29 (LWP 29900):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 28 (LWP 29899):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 27 (LWP 29898):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 26 (LWP 29897):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 25 (LWP 29896):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 24 (LWP 29895):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 23 (LWP 29894):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 22 (LWP 29893):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 21 (LWP 29892):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 20 (LWP 29891):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 19 (LWP 29890):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 18 (LWP 29889):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 17 (LWP 29888):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 16 (LWP 29887):
#0 0x00007f3bceb64fb9 in ?? ()
#1 0x00007f3bba070390 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 15 (LWP 29886):
#0 0x00007f3bceb64fb9 in ?? ()
#1 0x00007f3bba871540 in ?? ()
#2 0x0000000000000001 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000564a0f289098 in ?? ()
#5 0x00007f3bba871560 in ?? ()
#6 0x0000000000000002 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 14 (LWP 29884):
#0 0x00007f3bceb64fb9 in ?? ()
#1 0x00007f3bbb8730c0 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 13 (LWP 29883):
#0 0x00007f3bceb64ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 12 (LWP 29882):
#0 0x00007f3bcbecc947 in ?? ()
#1 0x00007f3bbc875580 in ?? ()
#2 0x00007f3bcf1975c7 in ?? ()
#3 0x00007f3bbc875580 in ?? ()
#4 0x0000000000000000 in ?? ()
Thread 11 (LWP 29881):
#0 0x00007f3bcbecc947 in ?? ()
#1 0x00007f3bbd076580 in ?? ()
#2 0x00007f3bcf1975c7 in ?? ()
#3 0x00007f3bbd076580 in ?? ()
#4 0x0000000000000000 in ?? ()
Thread 10 (LWP 29880):
#0 0x00007f3bcbecc947 in ?? ()
#1 0x00007f3bbd877580 in ?? ()
#2 0x00007f3bcf1975c7 in ?? ()
#3 0x00007f3bbd877580 in ?? ()
#4 0x0000000000000000 in ?? ()
Thread 9 (LWP 29879):
#0 0x00007f3bcbecc947 in ?? ()
#1 0x00007f3bbf45b580 in ?? ()
#2 0x00007f3bcf1975c7 in ?? ()
#3 0x00007f3bbf45b580 in ?? ()
#4 0x0000000000000000 in ?? ()
Thread 8 (LWP 29876):
#0 0x00007f3bcbebfbb9 in ?? ()
#1 0x00007f3bc0c5e800 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 7 (LWP 29875):
#0 0x00007f3bceb64fb9 in ?? ()
#1 0x00007f3bc045d4e0 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 6 (LWP 29874):
#0 0x00007f3bceb689e2 in ?? ()
#1 0x00007f3bbfc5c330 in ?? ()
#2 0x00007f3bbfc5c380 in ?? ()
#3 0x00007f3bbfc5c350 in ?? ()
#4 0x0000000000000001 in ?? ()
#5 0x00007f3bbfc5c6c0 in ?? ()
#6 0x0000564a0f2b27f0 in ?? ()
#7 0x0000564a0f13cfd0 in ?? ()
#8 0x0000564a0f13cfc0 in ?? ()
#9 0x00007fff4cb1a050 in ?? ()
#10 0x00007f3bcf69ed72 in ?? ()
#11 0x0000564a0f2b0bd0 in ?? ()
#12 0x0000564a0f1de2d0 in ?? ()
#13 0x00007f3bbfc5c390 in ?? ()
#14 0x00007f3bcdfe9040 in ?? ()
#15 0x00007f3b00000010 in ?? ()
#16 0x00007f3bce11df69 in ?? ()
#17 0x00007f3bbfc5c3b0 in ?? ()
#18 0x0000564a0f1c1dd0 in ?? ()
#19 0x00007f3bcdfe9040 in ?? ()
#20 0x0000000000000000 in ?? ()
Thread 5 (LWP 29868):
#0 0x00007f3bceb64fb9 in ?? ()
#1 0x00007f3bc1c602b0 in ?? ()
#2 0x0000000000000028 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000564a0f13d2e8 in ?? ()
#5 0x00007f3bc1c602d0 in ?? ()
#6 0x0000000000000050 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 4 (LWP 29867):
#0 0x00007f3bceb64fb9 in ?? ()
#1 0x00007f3bc24615c0 in ?? ()
#2 0x0000000000000002 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000564a0f288438 in ?? ()
#5 0x00007f3bc24615e0 in ?? ()
#6 0x0000000000000004 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 3 (LWP 29866):
#0 0x00007f3bceb64fb9 in ?? ()
#1 0x00007f3bc2c625c0 in ?? ()
#2 0x0000000000000002 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000564a0f2882f8 in ?? ()
#5 0x00007f3bc2c625e0 in ?? ()
#6 0x0000000000000004 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 2 (LWP 29865):
#0 0x00007f3bceb64fb9 in ?? ()
#1 0x00007f3bc34635c0 in ?? ()
#2 0x0000000000000002 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000564a0f2881b8 in ?? ()
#5 0x00007f3bc34635e0 in ?? ()
#6 0x0000000000000004 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 1 (LWP 29862):
#0 0x00007f3bceb68d50 in ?? ()
#1 0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20260430 02:02:56.551689 25189 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 1 with UUID a2771da784d84201a3de0860ab987f1f and pid 29598
************************ BEGIN STACKS **************************
[New LWP 29599]
[New LWP 29600]
[New LWP 29601]
[New LWP 29602]
[New LWP 29608]
[New LWP 29609]
[New LWP 29610]
[New LWP 29613]
[New LWP 29614]
[New LWP 29615]
[New LWP 29616]
[New LWP 29617]
[New LWP 29618]
[New LWP 29619]
[New LWP 29620]
[New LWP 29621]
[New LWP 29622]
[New LWP 29623]
[New LWP 29624]
[New LWP 29625]
[New LWP 29626]
[New LWP 29627]
[New LWP 29628]
[New LWP 29629]
[New LWP 29630]
[New LWP 29631]
[New LWP 29632]
[New LWP 29633]
[New LWP 29634]
[New LWP 29635]
[New LWP 29636]
[New LWP 29637]
[New LWP 29638]
[New LWP 29639]
[New LWP 29640]
[New LWP 29641]
[New LWP 29642]
[New LWP 29643]
[New LWP 29644]
[New LWP 29645]
[New LWP 29646]
[New LWP 29647]
[New LWP 29648]
[New LWP 29649]
[New LWP 29650]
[New LWP 29651]
[New LWP 29652]
[New LWP 29653]
[New LWP 29654]
[New LWP 29655]
[New LWP 29656]
[New LWP 29657]
[New LWP 29658]
[New LWP 29659]
[New LWP 29660]
[New LWP 29661]
[New LWP 29662]
[New LWP 29663]
[New LWP 29664]
[New LWP 29665]
[New LWP 29666]
[New LWP 29667]
[New LWP 29668]
[New LWP 29669]
[New LWP 29670]
[New LWP 29671]
[New LWP 29672]
[New LWP 29673]
[New LWP 29674]
[New LWP 29675]
[New LWP 29676]
[New LWP 29677]
[New LWP 29678]
[New LWP 29679]
[New LWP 29680]
[New LWP 29681]
[New LWP 29682]
[New LWP 29683]
[New LWP 29684]
[New LWP 29685]
[New LWP 29686]
[New LWP 29687]
[New LWP 29688]
[New LWP 29689]
[New LWP 29690]
[New LWP 29691]
[New LWP 29692]
[New LWP 29693]
[New LWP 29694]
[New LWP 29695]
[New LWP 29696]
[New LWP 29697]
[New LWP 29698]
[New LWP 29699]
[New LWP 29700]
[New LWP 29701]
[New LWP 29702]
[New LWP 29703]
[New LWP 29704]
[New LWP 29705]
[New LWP 29706]
[New LWP 29707]
[New LWP 29708]
[New LWP 29709]
[New LWP 29710]
[New LWP 29711]
[New LWP 29712]
[New LWP 29713]
[New LWP 29714]
[New LWP 29715]
[New LWP 29716]
[New LWP 29717]
[New LWP 29718]
[New LWP 29719]
[New LWP 29720]
[New LWP 29721]
[New LWP 29722]
[New LWP 29723]
[New LWP 29724]
[New LWP 29725]
[New LWP 29726]
[New LWP 29727]
0x00007f475b43dd50 in ?? ()
Id Target Id Frame
* 1 LWP 29598 "kudu" 0x00007f475b43dd50 in ?? ()
2 LWP 29599 "kudu" 0x00007f475b439fb9 in ?? ()
3 LWP 29600 "kudu" 0x00007f475b439fb9 in ?? ()
4 LWP 29601 "kudu" 0x00007f475b439fb9 in ?? ()
5 LWP 29602 "kernel-watcher-" 0x00007f475b439fb9 in ?? ()
6 LWP 29608 "ntp client-2960" 0x00007f475b43d9e2 in ?? ()
7 LWP 29609 "file cache-evic" 0x00007f475b439fb9 in ?? ()
8 LWP 29610 "sq_acceptor" 0x00007f4758794bb9 in ?? ()
9 LWP 29613 "rpc reactor-296" 0x00007f47587a1947 in ?? ()
10 LWP 29614 "rpc reactor-296" 0x00007f47587a1947 in ?? ()
11 LWP 29615 "rpc reactor-296" 0x00007f47587a1947 in ?? ()
12 LWP 29616 "rpc reactor-296" 0x00007f47587a1947 in ?? ()
13 LWP 29617 "MaintenanceMgr " 0x00007f475b439ad3 in ?? ()
14 LWP 29618 "txn-status-mana" 0x00007f475b439fb9 in ?? ()
15 LWP 29619 "collect_and_rem" 0x00007f475b439fb9 in ?? ()
16 LWP 29620 "tc-session-exp-" 0x00007f475b439fb9 in ?? ()
17 LWP 29621 "rpc worker-2962" 0x00007f475b439ad3 in ?? ()
18 LWP 29622 "rpc worker-2962" 0x00007f475b439ad3 in ?? ()
19 LWP 29623 "rpc worker-2962" 0x00007f475b439ad3 in ?? ()
20 LWP 29624 "rpc worker-2962" 0x00007f475b439ad3 in ?? ()
21 LWP 29625 "rpc worker-2962" 0x00007f475b439ad3 in ?? ()
22 LWP 29626 "rpc worker-2962" 0x00007f475b439ad3 in ?? ()
23 LWP 29627 "rpc worker-2962" 0x00007f475b439ad3 in ?? ()
24 LWP 29628 "rpc worker-2962" 0x00007f475b439ad3 in ?? ()
25 LWP 29629 "rpc worker-2962" 0x00007f475b439ad3 in ?? ()
26 LWP 29630 "rpc worker-2963" 0x00007f475b439ad3 in ?? ()
27 LWP 29631 "rpc worker-2963" 0x00007f475b439ad3 in ?? ()
28 LWP 29632 "rpc worker-2963" 0x00007f475b439ad3 in ?? ()
29 LWP 29633 "rpc worker-2963" 0x00007f475b439ad3 in ?? ()
30 LWP 29634 "rpc worker-2963" 0x00007f475b439ad3 in ?? ()
31 LWP 29635 "rpc worker-2963" 0x00007f475b439ad3 in ?? ()
32 LWP 29636 "rpc worker-2963" 0x00007f475b439ad3 in ?? ()
33 LWP 29637 "rpc worker-2963" 0x00007f475b439ad3 in ?? ()
34 LWP 29638 "rpc worker-2963" 0x00007f475b439ad3 in ?? ()
35 LWP 29639 "rpc worker-2963" 0x00007f475b439ad3 in ?? ()
36 LWP 29640 "rpc worker-2964" 0x00007f475b439ad3 in ?? ()
37 LWP 29641 "rpc worker-2964" 0x00007f475b439ad3 in ?? ()
38 LWP 29642 "rpc worker-2964" 0x00007f475b439ad3 in ?? ()
39 LWP 29643 "rpc worker-2964" 0x00007f475b439ad3 in ?? ()
40 LWP 29644 "rpc worker-2964" 0x00007f475b439ad3 in ?? ()
41 LWP 29645 "rpc worker-2964" 0x00007f475b439ad3 in ?? ()
42 LWP 29646 "rpc worker-2964" 0x00007f475b439ad3 in ?? ()
43 LWP 29647 "rpc worker-2964" 0x00007f475b439ad3 in ?? ()
44 LWP 29648 "rpc worker-2964" 0x00007f475b439ad3 in ?? ()
45 LWP 29649 "rpc worker-2964" 0x00007f475b439ad3 in ?? ()
46 LWP 29650 "rpc worker-2965" 0x00007f475b439ad3 in ?? ()
47 LWP 29651 "rpc worker-2965" 0x00007f475b439ad3 in ?? ()
48 LWP 29652 "rpc worker-2965" 0x00007f475b439ad3 in ?? ()
49 LWP 29653 "rpc worker-2965" 0x00007f475b439ad3 in ?? ()
50 LWP 29654 "rpc worker-2965" 0x00007f475b439ad3 in ?? ()
51 LWP 29655 "rpc worker-2965" 0x00007f475b439ad3 in ?? ()
52 LWP 29656 "rpc worker-2965" 0x00007f475b439ad3 in ?? ()
53 LWP 29657 "rpc worker-2965" 0x00007f475b439ad3 in ?? ()
54 LWP 29658 "rpc worker-2965" 0x00007f475b439ad3 in ?? ()
55 LWP 29659 "rpc worker-2965" 0x00007f475b439ad3 in ?? ()
56 LWP 29660 "rpc worker-2966" 0x00007f475b439ad3 in ?? ()
57 LWP 29661 "rpc worker-2966" 0x00007f475b439ad3 in ?? ()
58 LWP 29662 "rpc worker-2966" 0x00007f475b439ad3 in ?? ()
59 LWP 29663 "rpc worker-2966" 0x00007f475b439ad3 in ?? ()
60 LWP 29664 "rpc worker-2966" 0x00007f475b439ad3 in ?? ()
61 LWP 29665 "rpc worker-2966" 0x00007f475b439ad3 in ?? ()
62 LWP 29666 "rpc worker-2966" 0x00007f475b439ad3 in ?? ()
63 LWP 29667 "rpc worker-2966" 0x00007f475b439ad3 in ?? ()
64 LWP 29668 "rpc worker-2966" 0x00007f475b439ad3 in ?? ()
65 LWP 29669 "rpc worker-2966" 0x00007f475b439ad3 in ?? ()
66 LWP 29670 "rpc worker-2967" 0x00007f475b439ad3 in ?? ()
67 LWP 29671 "rpc worker-2967" 0x00007f475b439ad3 in ?? ()
68 LWP 29672 "rpc worker-2967" 0x00007f475b439ad3 in ?? ()
69 LWP 29673 "rpc worker-2967" 0x00007f475b439ad3 in ?? ()
70 LWP 29674 "rpc worker-2967" 0x00007f475b439ad3 in ?? ()
71 LWP 29675 "rpc worker-2967" 0x00007f475b439ad3 in ?? ()
72 LWP 29676 "rpc worker-2967" 0x00007f475b439ad3 in ?? ()
73 LWP 29677 "rpc worker-2967" 0x00007f475b439ad3 in ?? ()
74 LWP 29678 "rpc worker-2967" 0x00007f475b439ad3 in ?? ()
75 LWP 29679 "rpc worker-2967" 0x00007f475b439ad3 in ?? ()
76 LWP 29680 "rpc worker-2968" 0x00007f475b439ad3 in ?? ()
77 LWP 29681 "rpc worker-2968" 0x00007f475b439ad3 in ?? ()
78 LWP 29682 "rpc worker-2968" 0x00007f475b439ad3 in ?? ()
79 LWP 29683 "rpc worker-2968" 0x00007f475b439ad3 in ?? ()
80 LWP 29684 "rpc worker-2968" 0x00007f475b439ad3 in ?? ()
81 LWP 29685 "rpc worker-2968" 0x00007f475b439ad3 in ?? ()
82 LWP 29686 "rpc worker-2968" 0x00007f475b439ad3 in ?? ()
83 LWP 29687 "rpc worker-2968" 0x00007f475b439ad3 in ?? ()
84 LWP 29688 "rpc worker-2968" 0x00007f475b439ad3 in ?? ()
85 LWP 29689 "rpc worker-2968" 0x00007f475b439ad3 in ?? ()
86 LWP 29690 "rpc worker-2969" 0x00007f475b439ad3 in ?? ()
87 LWP 29691 "rpc worker-2969" 0x00007f475b439ad3 in ?? ()
88 LWP 29692 "rpc worker-2969" 0x00007f475b439ad3 in ?? ()
89 LWP 29693 "rpc worker-2969" 0x00007f475b439ad3 in ?? ()
90 LWP 29694 "rpc worker-2969" 0x00007f475b439ad3 in ?? ()
91 LWP 29695 "rpc worker-2969" 0x00007f475b439ad3 in ?? ()
92 LWP 29696 "rpc worker-2969" 0x00007f475b439ad3 in ?? ()
93 LWP 29697 "rpc worker-2969" 0x00007f475b439ad3 in ?? ()
94 LWP 29698 "rpc worker-2969" 0x00007f475b439ad3 in ?? ()
95 LWP 29699 "rpc worker-2969" 0x00007f475b439ad3 in ?? ()
96 LWP 29700 "rpc worker-2970" 0x00007f475b439ad3 in ?? ()
97 LWP 29701 "rpc worker-2970" 0x00007f475b439ad3 in ?? ()
98 LWP 29702 "rpc worker-2970" 0x00007f475b439ad3 in ?? ()
99 LWP 29703 "rpc worker-2970" 0x00007f475b439ad3 in ?? ()
100 LWP 29704 "rpc worker-2970" 0x00007f475b439ad3 in ?? ()
101 LWP 29705 "rpc worker-2970" 0x00007f475b439ad3 in ?? ()
102 LWP 29706 "rpc worker-2970" 0x00007f475b439ad3 in ?? ()
103 LWP 29707 "rpc worker-2970" 0x00007f475b439ad3 in ?? ()
104 LWP 29708 "rpc worker-2970" 0x00007f475b439ad3 in ?? ()
105 LWP 29709 "rpc worker-2970" 0x00007f475b439ad3 in ?? ()
106 LWP 29710 "rpc worker-2971" 0x00007f475b439ad3 in ?? ()
107 LWP 29711 "rpc worker-2971" 0x00007f475b439ad3 in ?? ()
108 LWP 29712 "rpc worker-2971" 0x00007f475b439ad3 in ?? ()
109 LWP 29713 "rpc worker-2971" 0x00007f475b439ad3 in ?? ()
110 LWP 29714 "rpc worker-2971" 0x00007f475b439ad3 in ?? ()
111 LWP 29715 "rpc worker-2971" 0x00007f475b439ad3 in ?? ()
112 LWP 29716 "rpc worker-2971" 0x00007f475b439ad3 in ?? ()
113 LWP 29717 "rpc worker-2971" 0x00007f475b439ad3 in ?? ()
114 LWP 29718 "rpc worker-2971" 0x00007f475b439ad3 in ?? ()
115 LWP 29719 "rpc worker-2971" 0x00007f475b439ad3 in ?? ()
116 LWP 29720 "rpc worker-2972" 0x00007f475b439ad3 in ?? ()
117 LWP 29721 "diag-logger-297" 0x00007f475b439fb9 in ?? ()
118 LWP 29722 "result-tracker-" 0x00007f475b439fb9 in ?? ()
119 LWP 29723 "excess-log-dele" 0x00007f475b439fb9 in ?? ()
120 LWP 29724 "tcmalloc-memory" 0x00007f475b439fb9 in ?? ()
121 LWP 29725 "acceptor-29725" 0x00007f47587a2fc7 in ?? ()
122 LWP 29726 "heartbeat-29726" 0x00007f475b439fb9 in ?? ()
123 LWP 29727 "maintenance_sch" 0x00007f475b439fb9 in ?? ()
Thread 123 (LWP 29727):
#0 0x00007f475b439fb9 in ?? ()
#1 0x00007f47118db330 in ?? ()
#2 0x0000000000000024 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000056052b30de60 in ?? ()
#5 0x00007f47118db350 in ?? ()
#6 0x0000000000000048 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 122 (LWP 29726):
#0 0x00007f475b439fb9 in ?? ()
#1 0x00007f47120dc310 in ?? ()
#2 0x0000000000000009 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000056052b25f640 in ?? ()
#5 0x00007f47120dc330 in ?? ()
#6 0x0000000000000012 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 121 (LWP 29725):
#0 0x00007f47587a2fc7 in ?? ()
#1 0x00007f47128dd128 in ?? ()
#2 0x00007f47128dd12c in ?? ()
#3 0x00007f47128dd130 in ?? ()
#4 0x0000000000000000 in ?? ()
Thread 120 (LWP 29724):
#0 0x00007f475b439fb9 in ?? ()
#1 0x00007f47130de4f0 in ?? ()
#2 0x0000000000000001 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x00007ffdd97d4310 in ?? ()
#5 0x00007f47130de510 in ?? ()
#6 0x0000000000000002 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 119 (LWP 29723):
#0 0x00007f475b439fb9 in ?? ()
#1 0x00007f47138df480 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 118 (LWP 29722):
#0 0x00007f475b439fb9 in ?? ()
#1 0x00007f47140e04e0 in ?? ()
#2 0x0000000000000009 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000056052b218830 in ?? ()
#5 0x00007f47140e0500 in ?? ()
#6 0x0000000000000012 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 117 (LWP 29721):
#0 0x00007f475b439fb9 in ?? ()
#1 0x00007f47148e1450 in ?? ()
#2 0x0000000000000009 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000056052b4aa2e0 in ?? ()
#5 0x00007f47148e1470 in ?? ()
#6 0x0000000000000012 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 116 (LWP 29720):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000004 in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x000056052b4a7878 in ?? ()
#4 0x00007f47150e2430 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f47150e2450 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 115 (LWP 29719):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000004 in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x000056052b4a76c8 in ?? ()
#4 0x00007f47158e3430 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f47158e3450 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 114 (LWP 29718):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 113 (LWP 29717):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 112 (LWP 29716):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 111 (LWP 29715):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 110 (LWP 29714):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 109 (LWP 29713):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 108 (LWP 29712):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 107 (LWP 29711):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 106 (LWP 29710):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 105 (LWP 29709):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 104 (LWP 29708):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 103 (LWP 29707):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 102 (LWP 29706):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 101 (LWP 29705):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 100 (LWP 29704):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 99 (LWP 29703):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 98 (LWP 29702):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 97 (LWP 29701):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 96 (LWP 29700):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 95 (LWP 29699):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 94 (LWP 29698):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 93 (LWP 29697):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 92 (LWP 29696):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 91 (LWP 29695):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 90 (LWP 29694):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 89 (LWP 29693):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 88 (LWP 29692):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 87 (LWP 29691):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 86 (LWP 29690):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 85 (LWP 29689):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 84 (LWP 29688):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 83 (LWP 29687):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 82 (LWP 29686):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 81 (LWP 29685):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 80 (LWP 29684):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 79 (LWP 29683):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 78 (LWP 29682):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 77 (LWP 29681):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 76 (LWP 29680):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000002 in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x000056052b472708 in ?? ()
#4 0x00007f472910a430 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f472910a450 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 75 (LWP 29679):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 74 (LWP 29678):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 73 (LWP 29677):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 72 (LWP 29676):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 71 (LWP 29675):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 70 (LWP 29674):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 69 (LWP 29673):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 68 (LWP 29672):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 67 (LWP 29671):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 66 (LWP 29670):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 65 (LWP 29669):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 64 (LWP 29668):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 63 (LWP 29667):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 62 (LWP 29666):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 61 (LWP 29665):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 60 (LWP 29664):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 59 (LWP 29663):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 58 (LWP 29662):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 57 (LWP 29661):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 56 (LWP 29660):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000002 in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x000056052b467368 in ?? ()
#4 0x00007f473311e430 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f473311e450 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 55 (LWP 29659):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 54 (LWP 29658):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 53 (LWP 29657):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 52 (LWP 29656):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 51 (LWP 29655):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 50 (LWP 29654):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 49 (LWP 29653):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 48 (LWP 29652):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 47 (LWP 29651):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 46 (LWP 29650):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 45 (LWP 29649):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 44 (LWP 29648):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 43 (LWP 29647):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 42 (LWP 29646):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 41 (LWP 29645):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 40 (LWP 29644):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 39 (LWP 29643):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 38 (LWP 29642):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 37 (LWP 29641):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 36 (LWP 29640):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000002 in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x000056052b464048 in ?? ()
#4 0x00007f473d132430 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f473d132450 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 35 (LWP 29639):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 34 (LWP 29638):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 33 (LWP 29637):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 32 (LWP 29636):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 31 (LWP 29635):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 30 (LWP 29634):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 29 (LWP 29633):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 28 (LWP 29632):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 27 (LWP 29631):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 26 (LWP 29630):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 25 (LWP 29629):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 24 (LWP 29628):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 23 (LWP 29627):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 22 (LWP 29626):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 21 (LWP 29625):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 20 (LWP 29624):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 19 (LWP 29623):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 18 (LWP 29622):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 17 (LWP 29621):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 16 (LWP 29620):
#0 0x00007f475b439fb9 in ?? ()
#1 0x00007f4747146390 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 15 (LWP 29619):
#0 0x00007f475b439fb9 in ?? ()
#1 0x00007f4747947540 in ?? ()
#2 0x0000000000000001 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000056052b2c3098 in ?? ()
#5 0x00007f4747947560 in ?? ()
#6 0x0000000000000002 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 14 (LWP 29618):
#0 0x00007f475b439fb9 in ?? ()
#1 0x00007f47481480c0 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 13 (LWP 29617):
#0 0x00007f475b439ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 12 (LWP 29616):
#0 0x00007f47587a1947 in ?? ()
#1 0x00007f474914a580 in ?? ()
#2 0x00007f475ba6c5c7 in ?? ()
#3 0x00007f474914a580 in ?? ()
#4 0x0000000000000000 in ?? ()
Thread 11 (LWP 29615):
#0 0x00007f47587a1947 in ?? ()
#1 0x00007f474994b580 in ?? ()
#2 0x00007f475ba6c5c7 in ?? ()
#3 0x00007f474994b580 in ?? ()
#4 0x0000000000000000 in ?? ()
Thread 10 (LWP 29614):
#0 0x00007f47587a1947 in ?? ()
#1 0x00007f474a14c580 in ?? ()
#2 0x00007f475ba6c5c7 in ?? ()
#3 0x00007f474a14c580 in ?? ()
#4 0x0000000000000000 in ?? ()
Thread 9 (LWP 29613):
#0 0x00007f47587a1947 in ?? ()
#1 0x00007f474bd30580 in ?? ()
#2 0x00007f475ba6c5c7 in ?? ()
#3 0x00007f474bd30580 in ?? ()
#4 0x0000000000000000 in ?? ()
Thread 8 (LWP 29610):
#0 0x00007f4758794bb9 in ?? ()
#1 0x00007f474d533800 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 7 (LWP 29609):
#0 0x00007f475b439fb9 in ?? ()
#1 0x00007f474cd324e0 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 6 (LWP 29608):
#0 0x00007f475b43d9e2 in ?? ()
#1 0x00007f474c531330 in ?? ()
#2 0x00007f474c531380 in ?? ()
#3 0x00007f474c531350 in ?? ()
#4 0x0000000000000001 in ?? ()
#5 0x00007f474c5316c0 in ?? ()
#6 0x000056052b2ec7f0 in ?? ()
#7 0x000056052b176fd0 in ?? ()
#8 0x000056052b176fc0 in ?? ()
#9 0x00007ffdd97d28e0 in ?? ()
#10 0x00007f475bf73d72 in ?? ()
#11 0x000056052b2eab10 in ?? ()
#12 0x000056052b2182d0 in ?? ()
#13 0x00007f474c531390 in ?? ()
#14 0x00007f475a8be040 in ?? ()
#15 0x00007f4700000010 in ?? ()
#16 0x00007f475a9f2f69 in ?? ()
#17 0x00007f474c5313b0 in ?? ()
#18 0x000056052b1fbdd0 in ?? ()
#19 0x00007f475a8be040 in ?? ()
#20 0x0000000000000000 in ?? ()
Thread 5 (LWP 29602):
#0 0x00007f475b439fb9 in ?? ()
#1 0x00007f474e5352b0 in ?? ()
#2 0x000000000000002e in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000056052b1772e8 in ?? ()
#5 0x00007f474e5352d0 in ?? ()
#6 0x000000000000005c in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 4 (LWP 29601):
#0 0x00007f475b439fb9 in ?? ()
#1 0x00007f474ed365c0 in ?? ()
#2 0x0000000000000002 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000056052b2c2438 in ?? ()
#5 0x00007f474ed365e0 in ?? ()
#6 0x0000000000000004 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 3 (LWP 29600):
#0 0x00007f475b439fb9 in ?? ()
#1 0x00007f474f5375c0 in ?? ()
#2 0x0000000000000002 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000056052b2c22f8 in ?? ()
#5 0x00007f474f5375e0 in ?? ()
#6 0x0000000000000004 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 2 (LWP 29599):
#0 0x00007f475b439fb9 in ?? ()
#1 0x00007f474fd385c0 in ?? ()
#2 0x0000000000000002 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000056052b2c21b8 in ?? ()
#5 0x00007f474fd385e0 in ?? ()
#6 0x0000000000000004 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 1 (LWP 29598):
#0 0x00007f475b43dd50 in ?? ()
#1 0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20260430 02:02:57.164177 25189 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 2 with UUID 5a60f2bd808d4e6da03625fcdb9225e8 and pid 29997
************************ BEGIN STACKS **************************
[New LWP 29999]
[New LWP 30000]
[New LWP 30001]
[New LWP 30002]
[New LWP 30008]
[New LWP 30009]
[New LWP 30010]
[New LWP 30013]
[New LWP 30014]
[New LWP 30015]
[New LWP 30016]
[New LWP 30018]
[New LWP 30019]
[New LWP 30021]
[New LWP 30022]
[New LWP 30023]
[New LWP 30024]
[New LWP 30025]
[New LWP 30026]
[New LWP 30027]
[New LWP 30028]
[New LWP 30029]
[New LWP 30030]
[New LWP 30031]
[New LWP 30032]
[New LWP 30033]
[New LWP 30034]
[New LWP 30035]
[New LWP 30036]
[New LWP 30037]
[New LWP 30038]
[New LWP 30039]
[New LWP 30040]
[New LWP 30041]
[New LWP 30042]
[New LWP 30043]
[New LWP 30044]
[New LWP 30045]
[New LWP 30046]
[New LWP 30047]
[New LWP 30048]
[New LWP 30049]
[New LWP 30050]
[New LWP 30051]
[New LWP 30052]
[New LWP 30053]
[New LWP 30054]
[New LWP 30055]
[New LWP 30056]
[New LWP 30057]
[New LWP 30058]
[New LWP 30059]
[New LWP 30060]
[New LWP 30061]
[New LWP 30062]
[New LWP 30063]
[New LWP 30064]
[New LWP 30065]
[New LWP 30066]
[New LWP 30067]
[New LWP 30068]
[New LWP 30069]
[New LWP 30070]
[New LWP 30071]
[New LWP 30072]
[New LWP 30073]
[New LWP 30074]
[New LWP 30075]
[New LWP 30076]
[New LWP 30077]
[New LWP 30078]
[New LWP 30079]
[New LWP 30080]
[New LWP 30081]
[New LWP 30082]
[New LWP 30083]
[New LWP 30084]
[New LWP 30085]
[New LWP 30086]
[New LWP 30087]
[New LWP 30088]
[New LWP 30089]
[New LWP 30090]
[New LWP 30091]
[New LWP 30092]
[New LWP 30093]
[New LWP 30094]
[New LWP 30095]
[New LWP 30096]
[New LWP 30097]
[New LWP 30098]
[New LWP 30099]
[New LWP 30100]
[New LWP 30101]
[New LWP 30102]
[New LWP 30103]
[New LWP 30104]
[New LWP 30105]
[New LWP 30106]
[New LWP 30107]
[New LWP 30108]
[New LWP 30109]
[New LWP 30110]
[New LWP 30111]
[New LWP 30112]
[New LWP 30113]
[New LWP 30114]
[New LWP 30115]
[New LWP 30116]
[New LWP 30117]
[New LWP 30118]
[New LWP 30119]
[New LWP 30120]
[New LWP 30121]
[New LWP 30122]
[New LWP 30123]
[New LWP 30124]
[New LWP 30125]
[New LWP 30126]
[New LWP 30127]
[New LWP 30128]
[New LWP 30129]
0x00007fa3f0893d50 in ?? ()
Id Target Id Frame
* 1 LWP 29997 "kudu" 0x00007fa3f0893d50 in ?? ()
2 LWP 29999 "kudu" 0x00007fa3f088ffb9 in ?? ()
3 LWP 30000 "kudu" 0x00007fa3f088ffb9 in ?? ()
4 LWP 30001 "kudu" 0x00007fa3f088ffb9 in ?? ()
5 LWP 30002 "kernel-watcher-" 0x00007fa3f088ffb9 in ?? ()
6 LWP 30008 "ntp client-3000" 0x00007fa3f08939e2 in ?? ()
7 LWP 30009 "file cache-evic" 0x00007fa3f088ffb9 in ?? ()
8 LWP 30010 "sq_acceptor" 0x00007fa3edbeabb9 in ?? ()
9 LWP 30013 "rpc reactor-300" 0x00007fa3edbf7947 in ?? ()
10 LWP 30014 "rpc reactor-300" 0x00007fa3edbf7947 in ?? ()
11 LWP 30015 "rpc reactor-300" 0x00007fa3edbf7947 in ?? ()
12 LWP 30016 "rpc reactor-300" 0x00007fa3edbf7947 in ?? ()
13 LWP 30018 "MaintenanceMgr " 0x00007fa3f088fad3 in ?? ()
14 LWP 30019 "txn-status-mana" 0x00007fa3f088ffb9 in ?? ()
15 LWP 30021 "collect_and_rem" 0x00007fa3f088ffb9 in ?? ()
16 LWP 30022 "tc-session-exp-" 0x00007fa3f088ffb9 in ?? ()
17 LWP 30023 "rpc worker-3002" 0x00007fa3f088fad3 in ?? ()
18 LWP 30024 "rpc worker-3002" 0x00007fa3f088fad3 in ?? ()
19 LWP 30025 "rpc worker-3002" 0x00007fa3f088fad3 in ?? ()
20 LWP 30026 "rpc worker-3002" 0x00007fa3f088fad3 in ?? ()
21 LWP 30027 "rpc worker-3002" 0x00007fa3f088fad3 in ?? ()
22 LWP 30028 "rpc worker-3002" 0x00007fa3f088fad3 in ?? ()
23 LWP 30029 "rpc worker-3002" 0x00007fa3f088fad3 in ?? ()
24 LWP 30030 "rpc worker-3003" 0x00007fa3f088fad3 in ?? ()
25 LWP 30031 "rpc worker-3003" 0x00007fa3f088fad3 in ?? ()
26 LWP 30032 "rpc worker-3003" 0x00007fa3f088fad3 in ?? ()
27 LWP 30033 "rpc worker-3003" 0x00007fa3f088fad3 in ?? ()
28 LWP 30034 "rpc worker-3003" 0x00007fa3f088fad3 in ?? ()
29 LWP 30035 "rpc worker-3003" 0x00007fa3f088fad3 in ?? ()
30 LWP 30036 "rpc worker-3003" 0x00007fa3f088fad3 in ?? ()
31 LWP 30037 "rpc worker-3003" 0x00007fa3f088fad3 in ?? ()
32 LWP 30038 "rpc worker-3003" 0x00007fa3f088fad3 in ?? ()
33 LWP 30039 "rpc worker-3003" 0x00007fa3f088fad3 in ?? ()
34 LWP 30040 "rpc worker-3004" 0x00007fa3f088fad3 in ?? ()
35 LWP 30041 "rpc worker-3004" 0x00007fa3f088fad3 in ?? ()
36 LWP 30042 "rpc worker-3004" 0x00007fa3f088fad3 in ?? ()
37 LWP 30043 "rpc worker-3004" 0x00007fa3f088fad3 in ?? ()
38 LWP 30044 "rpc worker-3004" 0x00007fa3f088fad3 in ?? ()
39 LWP 30045 "rpc worker-3004" 0x00007fa3f088fad3 in ?? ()
40 LWP 30046 "rpc worker-3004" 0x00007fa3f088fad3 in ?? ()
41 LWP 30047 "rpc worker-3004" 0x00007fa3f088fad3 in ?? ()
42 LWP 30048 "rpc worker-3004" 0x00007fa3f088fad3 in ?? ()
43 LWP 30049 "rpc worker-3004" 0x00007fa3f088fad3 in ?? ()
44 LWP 30050 "rpc worker-3005" 0x00007fa3f088fad3 in ?? ()
45 LWP 30051 "rpc worker-3005" 0x00007fa3f088fad3 in ?? ()
46 LWP 30052 "rpc worker-3005" 0x00007fa3f088fad3 in ?? ()
47 LWP 30053 "rpc worker-3005" 0x00007fa3f088fad3 in ?? ()
48 LWP 30054 "rpc worker-3005" 0x00007fa3f088fad3 in ?? ()
49 LWP 30055 "rpc worker-3005" 0x00007fa3f088fad3 in ?? ()
50 LWP 30056 "rpc worker-3005" 0x00007fa3f088fad3 in ?? ()
51 LWP 30057 "rpc worker-3005" 0x00007fa3f088fad3 in ?? ()
52 LWP 30058 "rpc worker-3005" 0x00007fa3f088fad3 in ?? ()
53 LWP 30059 "rpc worker-3005" 0x00007fa3f088fad3 in ?? ()
54 LWP 30060 "rpc worker-3006" 0x00007fa3f088fad3 in ?? ()
55 LWP 30061 "rpc worker-3006" 0x00007fa3f088fad3 in ?? ()
56 LWP 30062 "rpc worker-3006" 0x00007fa3f088fad3 in ?? ()
57 LWP 30063 "rpc worker-3006" 0x00007fa3f088fad3 in ?? ()
58 LWP 30064 "rpc worker-3006" 0x00007fa3f088fad3 in ?? ()
59 LWP 30065 "rpc worker-3006" 0x00007fa3f088fad3 in ?? ()
60 LWP 30066 "rpc worker-3006" 0x00007fa3f088fad3 in ?? ()
61 LWP 30067 "rpc worker-3006" 0x00007fa3f088fad3 in ?? ()
62 LWP 30068 "rpc worker-3006" 0x00007fa3f088fad3 in ?? ()
63 LWP 30069 "rpc worker-3006" 0x00007fa3f088fad3 in ?? ()
64 LWP 30070 "rpc worker-3007" 0x00007fa3f088fad3 in ?? ()
65 LWP 30071 "rpc worker-3007" 0x00007fa3f088fad3 in ?? ()
66 LWP 30072 "rpc worker-3007" 0x00007fa3f088fad3 in ?? ()
67 LWP 30073 "rpc worker-3007" 0x00007fa3f088fad3 in ?? ()
68 LWP 30074 "rpc worker-3007" 0x00007fa3f088fad3 in ?? ()
69 LWP 30075 "rpc worker-3007" 0x00007fa3f088fad3 in ?? ()
70 LWP 30076 "rpc worker-3007" 0x00007fa3f088fad3 in ?? ()
71 LWP 30077 "rpc worker-3007" 0x00007fa3f088fad3 in ?? ()
72 LWP 30078 "rpc worker-3007" 0x00007fa3f088fad3 in ?? ()
73 LWP 30079 "rpc worker-3007" 0x00007fa3f088fad3 in ?? ()
74 LWP 30080 "rpc worker-3008" 0x00007fa3f088fad3 in ?? ()
75 LWP 30081 "rpc worker-3008" 0x00007fa3f088fad3 in ?? ()
76 LWP 30082 "rpc worker-3008" 0x00007fa3f088fad3 in ?? ()
77 LWP 30083 "rpc worker-3008" 0x00007fa3f088fad3 in ?? ()
78 LWP 30084 "rpc worker-3008" 0x00007fa3f088fad3 in ?? ()
79 LWP 30085 "rpc worker-3008" 0x00007fa3f088fad3 in ?? ()
80 LWP 30086 "rpc worker-3008" 0x00007fa3f088fad3 in ?? ()
81 LWP 30087 "rpc worker-3008" 0x00007fa3f088fad3 in ?? ()
82 LWP 30088 "rpc worker-3008" 0x00007fa3f088fad3 in ?? ()
83 LWP 30089 "rpc worker-3008" 0x00007fa3f088fad3 in ?? ()
84 LWP 30090 "rpc worker-3009" 0x00007fa3f088fad3 in ?? ()
85 LWP 30091 "rpc worker-3009" 0x00007fa3f088fad3 in ?? ()
86 LWP 30092 "rpc worker-3009" 0x00007fa3f088fad3 in ?? ()
87 LWP 30093 "rpc worker-3009" 0x00007fa3f088fad3 in ?? ()
88 LWP 30094 "rpc worker-3009" 0x00007fa3f088fad3 in ?? ()
89 LWP 30095 "rpc worker-3009" 0x00007fa3f088fad3 in ?? ()
90 LWP 30096 "rpc worker-3009" 0x00007fa3f088fad3 in ?? ()
91 LWP 30097 "rpc worker-3009" 0x00007fa3f088fad3 in ?? ()
92 LWP 30098 "rpc worker-3009" 0x00007fa3f088fad3 in ?? ()
93 LWP 30099 "rpc worker-3009" 0x00007fa3f088fad3 in ?? ()
94 LWP 30100 "rpc worker-3010" 0x00007fa3f088fad3 in ?? ()
95 LWP 30101 "rpc worker-3010" 0x00007fa3f088fad3 in ?? ()
96 LWP 30102 "rpc worker-3010" 0x00007fa3f088fad3 in ?? ()
97 LWP 30103 "rpc worker-3010" 0x00007fa3f088fad3 in ?? ()
98 LWP 30104 "rpc worker-3010" 0x00007fa3f088fad3 in ?? ()
99 LWP 30105 "rpc worker-3010" 0x00007fa3f088fad3 in ?? ()
100 LWP 30106 "rpc worker-3010" 0x00007fa3f088fad3 in ?? ()
101 LWP 30107 "rpc worker-3010" 0x00007fa3f088fad3 in ?? ()
102 LWP 30108 "rpc worker-3010" 0x00007fa3f088fad3 in ?? ()
103 LWP 30109 "rpc worker-3010" 0x00007fa3f088fad3 in ?? ()
104 LWP 30110 "rpc worker-3011" 0x00007fa3f088fad3 in ?? ()
105 LWP 30111 "rpc worker-3011" 0x00007fa3f088fad3 in ?? ()
106 LWP 30112 "rpc worker-3011" 0x00007fa3f088fad3 in ?? ()
107 LWP 30113 "rpc worker-3011" 0x00007fa3f088fad3 in ?? ()
108 LWP 30114 "rpc worker-3011" 0x00007fa3f088fad3 in ?? ()
109 LWP 30115 "rpc worker-3011" 0x00007fa3f088fad3 in ?? ()
110 LWP 30116 "rpc worker-3011" 0x00007fa3f088fad3 in ?? ()
111 LWP 30117 "rpc worker-3011" 0x00007fa3f088fad3 in ?? ()
112 LWP 30118 "rpc worker-3011" 0x00007fa3f088fad3 in ?? ()
113 LWP 30119 "rpc worker-3011" 0x00007fa3f088fad3 in ?? ()
114 LWP 30120 "rpc worker-3012" 0x00007fa3f088fad3 in ?? ()
115 LWP 30121 "rpc worker-3012" 0x00007fa3f088fad3 in ?? ()
116 LWP 30122 "rpc worker-3012" 0x00007fa3f088fad3 in ?? ()
117 LWP 30123 "diag-logger-301" 0x00007fa3f088ffb9 in ?? ()
118 LWP 30124 "result-tracker-" 0x00007fa3f088ffb9 in ?? ()
119 LWP 30125 "excess-log-dele" 0x00007fa3f088ffb9 in ?? ()
120 LWP 30126 "tcmalloc-memory" 0x00007fa3f088ffb9 in ?? ()
121 LWP 30127 "acceptor-30127" 0x00007fa3edbf8fc7 in ?? ()
122 LWP 30128 "heartbeat-30128" 0x00007fa3f088ffb9 in ?? ()
123 LWP 30129 "maintenance_sch" 0x00007fa3f088ffb9 in ?? ()
Thread 123 (LWP 30129):
#0 0x00007fa3f088ffb9 in ?? ()
#1 0x00007fa3a6530330 in ?? ()
#2 0x0000000000000023 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000560a19d2be60 in ?? ()
#5 0x00007fa3a6530350 in ?? ()
#6 0x0000000000000046 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 122 (LWP 30128):
#0 0x00007fa3f088ffb9 in ?? ()
#1 0x00007fa3a6d31310 in ?? ()
#2 0x000000000000000a in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000560a19c7d640 in ?? ()
#5 0x00007fa3a6d31330 in ?? ()
#6 0x0000000000000014 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 121 (LWP 30127):
#0 0x00007fa3edbf8fc7 in ?? ()
#1 0x00007fa3a7532128 in ?? ()
#2 0x00007fa3a753212c in ?? ()
#3 0x00007fa3a7532130 in ?? ()
#4 0x0000000000000000 in ?? ()
Thread 120 (LWP 30126):
#0 0x00007fa3f088ffb9 in ?? ()
#1 0x00007fa3a7d334f0 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 119 (LWP 30125):
#0 0x00007fa3f088ffb9 in ?? ()
#1 0x00007fa3a8534480 in ?? ()
#2 0x0000000000000001 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x00007ffd7f3bc9a0 in ?? ()
#5 0x00007fa3a85344a0 in ?? ()
#6 0x0000000000000002 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 118 (LWP 30124):
#0 0x00007fa3f088ffb9 in ?? ()
#1 0x00007fa3a8d354e0 in ?? ()
#2 0x0000000000000009 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000560a19c36830 in ?? ()
#5 0x00007fa3a8d35500 in ?? ()
#6 0x0000000000000012 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 117 (LWP 30123):
#0 0x00007fa3f088ffb9 in ?? ()
#1 0x00007fa3a9536450 in ?? ()
#2 0x0000000000000009 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000560a19f32be0 in ?? ()
#5 0x00007fa3a9536470 in ?? ()
#6 0x0000000000000012 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 116 (LWP 30122):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000003 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x0000560a19f4443c in ?? ()
#4 0x00007fa3a9d37430 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fa3a9d37450 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x0000560a19f32a30 in ?? ()
#9 0x00007fa3f088f770 in ?? ()
#10 0x00007fa3a9d37450 in ?? ()
#11 0x0000000100000000 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 115 (LWP 30121):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 114 (LWP 30120):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 113 (LWP 30119):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 112 (LWP 30118):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 111 (LWP 30117):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 110 (LWP 30116):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 109 (LWP 30115):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 108 (LWP 30114):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 107 (LWP 30113):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 106 (LWP 30112):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 105 (LWP 30111):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000005 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x0000560a19f4428c in ?? ()
#4 0x00007fa3af542430 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fa3af542450 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x0000560a19ef7d50 in ?? ()
#9 0x00007fa3f088f770 in ?? ()
#10 0x00007fa3af542450 in ?? ()
#11 0x0000000100000000 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 104 (LWP 30110):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 103 (LWP 30109):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 102 (LWP 30108):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 101 (LWP 30107):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 100 (LWP 30106):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 99 (LWP 30105):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 98 (LWP 30104):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 97 (LWP 30103):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 96 (LWP 30102):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 95 (LWP 30101):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 94 (LWP 30100):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 93 (LWP 30099):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 92 (LWP 30098):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 91 (LWP 30097):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 90 (LWP 30096):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 89 (LWP 30095):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 88 (LWP 30094):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 87 (LWP 30093):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 86 (LWP 30092):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 85 (LWP 30091):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 84 (LWP 30090):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 83 (LWP 30089):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 82 (LWP 30088):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 81 (LWP 30087):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 80 (LWP 30086):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 79 (LWP 30085):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 78 (LWP 30084):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 77 (LWP 30083):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 76 (LWP 30082):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 75 (LWP 30081):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 74 (LWP 30080):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 73 (LWP 30079):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 72 (LWP 30078):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 71 (LWP 30077):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000094 in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x0000560a19f40168 in ?? ()
#4 0x00007fa3c0564430 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fa3c0564450 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 70 (LWP 30076):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000085 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x0000560a19f3dd8c in ?? ()
#4 0x00007fa3c0d65430 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fa3c0d65450 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x0000560a19ef3570 in ?? ()
#9 0x00007fa3f088f770 in ?? ()
#10 0x00007fa3c0d65450 in ?? ()
#11 0x0000000100000000 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 69 (LWP 30075):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 68 (LWP 30074):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 67 (LWP 30073):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 66 (LWP 30072):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 65 (LWP 30071):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 64 (LWP 30070):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 63 (LWP 30069):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 62 (LWP 30068):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 61 (LWP 30067):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 60 (LWP 30066):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 59 (LWP 30065):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 58 (LWP 30064):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 57 (LWP 30063):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 56 (LWP 30062):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 55 (LWP 30061):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 54 (LWP 30060):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 53 (LWP 30059):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 52 (LWP 30058):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 51 (LWP 30057):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000002 in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x0000560a19f3b248 in ?? ()
#4 0x00007fa3ca578430 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fa3ca578450 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 50 (LWP 30056):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 49 (LWP 30055):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 48 (LWP 30054):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 47 (LWP 30053):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 46 (LWP 30052):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 45 (LWP 30051):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 44 (LWP 30050):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 43 (LWP 30049):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 42 (LWP 30048):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 41 (LWP 30047):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 40 (LWP 30046):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 39 (LWP 30045):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 38 (LWP 30044):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 37 (LWP 30043):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 36 (LWP 30042):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x0000560a19f36e5c in ?? ()
#4 0x00007fa3d1d87430 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fa3d1d87450 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x0000560a19d40eb0 in ?? ()
#9 0x00007fa3f088f770 in ?? ()
#10 0x00007fa3d1d87450 in ?? ()
#11 0x0000000100000000 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 35 (LWP 30041):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x000000000000004b in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x0000560a19f3a9dc in ?? ()
#4 0x00007fa3d2588430 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fa3d2588450 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x0000560a19d40d90 in ?? ()
#9 0x00007fa3f088f770 in ?? ()
#10 0x00007fa3d2588450 in ?? ()
#11 0x0000000100000000 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 34 (LWP 30040):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x0000560a19f3ad3c in ?? ()
#4 0x00007fa3d2d89430 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fa3d2d89450 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x0000560a19d40c70 in ?? ()
#9 0x00007fa3f088f770 in ?? ()
#10 0x00007fa3d2d89450 in ?? ()
#11 0x0000000100000000 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 33 (LWP 30039):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x0000560a19f3b09c in ?? ()
#4 0x00007fa3d358a430 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fa3d358a450 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x0000560a19d40b50 in ?? ()
#9 0x00007fa3f088f770 in ?? ()
#10 0x00007fa3d358a450 in ?? ()
#11 0x0000000100000000 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 32 (LWP 30038):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 31 (LWP 30037):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 30 (LWP 30036):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 29 (LWP 30035):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 28 (LWP 30034):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 27 (LWP 30033):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 26 (LWP 30032):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 25 (LWP 30031):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 24 (LWP 30030):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 23 (LWP 30029):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 22 (LWP 30028):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 21 (LWP 30027):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 20 (LWP 30026):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 19 (LWP 30025):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 18 (LWP 30024):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 17 (LWP 30023):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 16 (LWP 30022):
#0 0x00007fa3f088ffb9 in ?? ()
#1 0x00007fa3dbd9b390 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 15 (LWP 30021):
#0 0x00007fa3f088ffb9 in ?? ()
#1 0x00007fa3dc59c540 in ?? ()
#2 0x0000000000000001 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000560a19ce1098 in ?? ()
#5 0x00007fa3dc59c560 in ?? ()
#6 0x0000000000000002 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 14 (LWP 30019):
#0 0x00007fa3f088ffb9 in ?? ()
#1 0x00007fa3dd59e0c0 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 13 (LWP 30018):
#0 0x00007fa3f088fad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 12 (LWP 30016):
#0 0x00007fa3edbf7947 in ?? ()
#1 0x00007fa3de5a0580 in ?? ()
#2 0x00007fa3f0ec25c7 in ?? ()
#3 0x00007fa3de5a0580 in ?? ()
#4 0x0000000000000000 in ?? ()
Thread 11 (LWP 30015):
#0 0x00007fa3edbf7947 in ?? ()
#1 0x00007fa3deda1580 in ?? ()
#2 0x00007fa3f0ec25c7 in ?? ()
#3 0x00007fa3deda1580 in ?? ()
#4 0x0000000000000000 in ?? ()
Thread 10 (LWP 30014):
#0 0x00007fa3edbf7947 in ?? ()
#1 0x00007fa3df5a2580 in ?? ()
#2 0x00007fa3f0ec25c7 in ?? ()
#3 0x00007fa3df5a2580 in ?? ()
#4 0x0000000000000000 in ?? ()
Thread 9 (LWP 30013):
#0 0x00007fa3edbf7947 in ?? ()
#1 0x00007fa3e1186580 in ?? ()
#2 0x00007fa3f0ec25c7 in ?? ()
#3 0x00007fa3e1186580 in ?? ()
#4 0x0000000000000000 in ?? ()
Thread 8 (LWP 30010):
#0 0x00007fa3edbeabb9 in ?? ()
#1 0x00007fa3e2989800 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 7 (LWP 30009):
#0 0x00007fa3f088ffb9 in ?? ()
#1 0x00007fa3e21884e0 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 6 (LWP 30008):
#0 0x00007fa3f08939e2 in ?? ()
#1 0x00007fa3e1987330 in ?? ()
#2 0x00007fa3e1987380 in ?? ()
#3 0x00007fa3e1987350 in ?? ()
#4 0x0000000000000001 in ?? ()
#5 0x00007fa3e19876c0 in ?? ()
#6 0x0000560a19d0a7f0 in ?? ()
#7 0x0000560a19b94fd0 in ?? ()
#8 0x0000560a19b94fc0 in ?? ()
#9 0x00007ffd7f3baf70 in ?? ()
#10 0x00007fa3f13c9d72 in ?? ()
#11 0x0000560a19d08b30 in ?? ()
#12 0x0000560a19c362d0 in ?? ()
#13 0x00007fa3e1987390 in ?? ()
#14 0x00007fa3efd14040 in ?? ()
#15 0x00007fa300000010 in ?? ()
#16 0x00007fa3efe48f69 in ?? ()
#17 0x00007fa3e19873b0 in ?? ()
#18 0x0000560a19c19dd0 in ?? ()
#19 0x00007fa3efd14040 in ?? ()
#20 0x0000000000000000 in ?? ()
Thread 5 (LWP 30002):
#0 0x00007fa3f088ffb9 in ?? ()
#1 0x00007fa3e398b2b0 in ?? ()
#2 0x000000000000002d in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000560a19b952e8 in ?? ()
#5 0x00007fa3e398b2d0 in ?? ()
#6 0x000000000000005a in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 4 (LWP 30001):
#0 0x00007fa3f088ffb9 in ?? ()
#1 0x00007fa3e418c5c0 in ?? ()
#2 0x0000000000000002 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000560a19ce0438 in ?? ()
#5 0x00007fa3e418c5e0 in ?? ()
#6 0x0000000000000004 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 3 (LWP 30000):
#0 0x00007fa3f088ffb9 in ?? ()
#1 0x00007fa3e498d5c0 in ?? ()
#2 0x0000000000000002 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000560a19ce02f8 in ?? ()
#5 0x00007fa3e498d5e0 in ?? ()
#6 0x0000000000000004 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 2 (LWP 29999):
#0 0x00007fa3f088ffb9 in ?? ()
#1 0x00007fa3e518e5c0 in ?? ()
#2 0x0000000000000002 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000560a19ce01b8 in ?? ()
#5 0x00007fa3e518e5e0 in ?? ()
#6 0x0000000000000004 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 1 (LWP 29997):
#0 0x00007fa3f0893d50 in ?? ()
#1 0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20260430 02:02:57.780022 25189 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 3 with UUID fdcc2c1450744cb499d898b871318fa0 and pid 29730
************************ BEGIN STACKS **************************
[New LWP 29731]
[New LWP 29732]
[New LWP 29733]
[New LWP 29734]
[New LWP 29740]
[New LWP 29741]
[New LWP 29742]
[New LWP 29745]
[New LWP 29746]
[New LWP 29747]
[New LWP 29748]
[New LWP 29749]
[New LWP 29750]
[New LWP 29752]
[New LWP 29753]
[New LWP 29754]
[New LWP 29755]
[New LWP 29756]
[New LWP 29757]
[New LWP 29758]
[New LWP 29759]
[New LWP 29760]
[New LWP 29761]
[New LWP 29762]
[New LWP 29763]
[New LWP 29764]
[New LWP 29765]
[New LWP 29766]
[New LWP 29767]
[New LWP 29768]
[New LWP 29769]
[New LWP 29770]
[New LWP 29771]
[New LWP 29772]
[New LWP 29773]
[New LWP 29774]
[New LWP 29775]
[New LWP 29776]
[New LWP 29777]
[New LWP 29778]
[New LWP 29779]
[New LWP 29780]
[New LWP 29781]
[New LWP 29782]
[New LWP 29783]
[New LWP 29784]
[New LWP 29785]
[New LWP 29786]
[New LWP 29787]
[New LWP 29788]
[New LWP 29789]
[New LWP 29790]
[New LWP 29791]
[New LWP 29792]
[New LWP 29793]
[New LWP 29794]
[New LWP 29795]
[New LWP 29796]
[New LWP 29797]
[New LWP 29798]
[New LWP 29799]
[New LWP 29800]
[New LWP 29801]
[New LWP 29802]
[New LWP 29803]
[New LWP 29804]
[New LWP 29805]
[New LWP 29806]
[New LWP 29807]
[New LWP 29808]
[New LWP 29809]
[New LWP 29810]
[New LWP 29811]
[New LWP 29812]
[New LWP 29813]
[New LWP 29814]
[New LWP 29815]
[New LWP 29816]
[New LWP 29817]
[New LWP 29818]
[New LWP 29819]
[New LWP 29820]
[New LWP 29821]
[New LWP 29822]
[New LWP 29823]
[New LWP 29824]
[New LWP 29825]
[New LWP 29826]
[New LWP 29827]
[New LWP 29828]
[New LWP 29829]
[New LWP 29830]
[New LWP 29831]
[New LWP 29832]
[New LWP 29833]
[New LWP 29834]
[New LWP 29835]
[New LWP 29836]
[New LWP 29837]
[New LWP 29838]
[New LWP 29839]
[New LWP 29840]
[New LWP 29841]
[New LWP 29842]
[New LWP 29843]
[New LWP 29844]
[New LWP 29845]
[New LWP 29846]
[New LWP 29847]
[New LWP 29848]
[New LWP 29849]
[New LWP 29850]
[New LWP 29851]
[New LWP 29852]
[New LWP 29853]
[New LWP 29854]
[New LWP 29855]
[New LWP 29856]
[New LWP 29857]
[New LWP 29858]
[New LWP 29859]
[New LWP 29860]
[New LWP 30306]
0x00007fadeb899d50 in ?? ()
Id Target Id Frame
* 1 LWP 29730 "kudu" 0x00007fadeb899d50 in ?? ()
2 LWP 29731 "kudu" 0x00007fadeb895fb9 in ?? ()
3 LWP 29732 "kudu" 0x00007fadeb895fb9 in ?? ()
4 LWP 29733 "kudu" 0x00007fadeb895fb9 in ?? ()
5 LWP 29734 "kernel-watcher-" 0x00007fadeb895fb9 in ?? ()
6 LWP 29740 "ntp client-2974" 0x00007fadeb8999e2 in ?? ()
7 LWP 29741 "file cache-evic" 0x00007fadeb895fb9 in ?? ()
8 LWP 29742 "sq_acceptor" 0x00007fade8bf0bb9 in ?? ()
9 LWP 29745 "rpc reactor-297" 0x00007fade8bfd947 in ?? ()
10 LWP 29746 "rpc reactor-297" 0x00007fade8bfd947 in ?? ()
11 LWP 29747 "rpc reactor-297" 0x00007fade8bfd947 in ?? ()
12 LWP 29748 "rpc reactor-297" 0x00007fade8bfd947 in ?? ()
13 LWP 29749 "MaintenanceMgr " 0x00007fadeb895ad3 in ?? ()
14 LWP 29750 "txn-status-mana" 0x00007fadeb895fb9 in ?? ()
15 LWP 29752 "collect_and_rem" 0x00007fadeb895fb9 in ?? ()
16 LWP 29753 "tc-session-exp-" 0x00007fadeb895fb9 in ?? ()
17 LWP 29754 "rpc worker-2975" 0x00007fadeb895ad3 in ?? ()
18 LWP 29755 "rpc worker-2975" 0x00007fadeb895ad3 in ?? ()
19 LWP 29756 "rpc worker-2975" 0x00007fadeb895ad3 in ?? ()
20 LWP 29757 "rpc worker-2975" 0x00007fadeb895ad3 in ?? ()
21 LWP 29758 "rpc worker-2975" 0x00007fadeb895ad3 in ?? ()
22 LWP 29759 "rpc worker-2975" 0x00007fadeb895ad3 in ?? ()
23 LWP 29760 "rpc worker-2976" 0x00007fadeb895ad3 in ?? ()
24 LWP 29761 "rpc worker-2976" 0x00007fadeb895ad3 in ?? ()
25 LWP 29762 "rpc worker-2976" 0x00007fadeb895ad3 in ?? ()
26 LWP 29763 "rpc worker-2976" 0x00007fadeb895ad3 in ?? ()
27 LWP 29764 "rpc worker-2976" 0x00007fadeb895ad3 in ?? ()
28 LWP 29765 "rpc worker-2976" 0x00007fadeb895ad3 in ?? ()
29 LWP 29766 "rpc worker-2976" 0x00007fadeb895ad3 in ?? ()
30 LWP 29767 "rpc worker-2976" 0x00007fadeb895ad3 in ?? ()
31 LWP 29768 "rpc worker-2976" 0x00007fadeb895ad3 in ?? ()
32 LWP 29769 "rpc worker-2976" 0x00007fadeb895ad3 in ?? ()
33 LWP 29770 "rpc worker-2977" 0x00007fadeb895ad3 in ?? ()
34 LWP 29771 "rpc worker-2977" 0x00007fadeb895ad3 in ?? ()
35 LWP 29772 "rpc worker-2977" 0x00007fadeb895ad3 in ?? ()
36 LWP 29773 "rpc worker-2977" 0x00007fadeb895ad3 in ?? ()
37 LWP 29774 "rpc worker-2977" 0x00007fadeb895ad3 in ?? ()
38 LWP 29775 "rpc worker-2977" 0x00007fadeb895ad3 in ?? ()
39 LWP 29776 "rpc worker-2977" 0x00007fadeb895ad3 in ?? ()
40 LWP 29777 "rpc worker-2977" 0x00007fadeb895ad3 in ?? ()
41 LWP 29778 "rpc worker-2977" 0x00007fadeb895ad3 in ?? ()
42 LWP 29779 "rpc worker-2977" 0x00007fadeb895ad3 in ?? ()
43 LWP 29780 "rpc worker-2978" 0x00007fadeb895ad3 in ?? ()
44 LWP 29781 "rpc worker-2978" 0x00007fadeb895ad3 in ?? ()
45 LWP 29782 "rpc worker-2978" 0x00007fadeb895ad3 in ?? ()
46 LWP 29783 "rpc worker-2978" 0x00007fadeb895ad3 in ?? ()
47 LWP 29784 "rpc worker-2978" 0x00007fadeb895ad3 in ?? ()
48 LWP 29785 "rpc worker-2978" 0x00007fadeb895ad3 in ?? ()
49 LWP 29786 "rpc worker-2978" 0x00007fadeb895ad3 in ?? ()
50 LWP 29787 "rpc worker-2978" 0x00007fadeb895ad3 in ?? ()
51 LWP 29788 "rpc worker-2978" 0x00007fadeb895ad3 in ?? ()
52 LWP 29789 "rpc worker-2978" 0x00007fadeb895ad3 in ?? ()
53 LWP 29790 "rpc worker-2979" 0x00007fadeb895ad3 in ?? ()
54 LWP 29791 "rpc worker-2979" 0x00007fadeb895ad3 in ?? ()
55 LWP 29792 "rpc worker-2979" 0x00007fadeb895ad3 in ?? ()
56 LWP 29793 "rpc worker-2979" 0x00007fadeb895ad3 in ?? ()
57 LWP 29794 "rpc worker-2979" 0x00007fadeb895ad3 in ?? ()
58 LWP 29795 "rpc worker-2979" 0x00007fadeb895ad3 in ?? ()
59 LWP 29796 "rpc worker-2979" 0x00007fadeb895ad3 in ?? ()
60 LWP 29797 "rpc worker-2979" 0x00007fadeb895ad3 in ?? ()
61 LWP 29798 "rpc worker-2979" 0x00007fadeb895ad3 in ?? ()
62 LWP 29799 "rpc worker-2979" 0x00007fadeb895ad3 in ?? ()
63 LWP 29800 "rpc worker-2980" 0x00007fadeb895ad3 in ?? ()
64 LWP 29801 "rpc worker-2980" 0x00007fadeb895ad3 in ?? ()
65 LWP 29802 "rpc worker-2980" 0x00007fadeb895ad3 in ?? ()
66 LWP 29803 "rpc worker-2980" 0x00007fadeb895ad3 in ?? ()
67 LWP 29804 "rpc worker-2980" 0x00007fadeb895ad3 in ?? ()
68 LWP 29805 "rpc worker-2980" 0x00007fadeb895ad3 in ?? ()
69 LWP 29806 "rpc worker-2980" 0x00007fadeb895ad3 in ?? ()
70 LWP 29807 "rpc worker-2980" 0x00007fadeb895ad3 in ?? ()
71 LWP 29808 "rpc worker-2980" 0x00007fadeb895ad3 in ?? ()
72 LWP 29809 "rpc worker-2980" 0x00007fadeb895ad3 in ?? ()
73 LWP 29810 "rpc worker-2981" 0x00007fadeb895ad3 in ?? ()
74 LWP 29811 "rpc worker-2981" 0x00007fadeb895ad3 in ?? ()
75 LWP 29812 "rpc worker-2981" 0x00007fadeb895ad3 in ?? ()
76 LWP 29813 "rpc worker-2981" 0x00007fadeb895ad3 in ?? ()
77 LWP 29814 "rpc worker-2981" 0x00007fadeb895ad3 in ?? ()
78 LWP 29815 "rpc worker-2981" 0x00007fadeb895ad3 in ?? ()
79 LWP 29816 "rpc worker-2981" 0x00007fadeb895ad3 in ?? ()
80 LWP 29817 "rpc worker-2981" 0x00007fadeb895ad3 in ?? ()
81 LWP 29818 "rpc worker-2981" 0x00007fadeb895ad3 in ?? ()
82 LWP 29819 "rpc worker-2981" 0x00007fadeb895ad3 in ?? ()
83 LWP 29820 "rpc worker-2982" 0x00007fadeb895ad3 in ?? ()
84 LWP 29821 "rpc worker-2982" 0x00007fadeb895ad3 in ?? ()
85 LWP 29822 "rpc worker-2982" 0x00007fadeb895ad3 in ?? ()
86 LWP 29823 "rpc worker-2982" 0x00007fadeb895ad3 in ?? ()
87 LWP 29824 "rpc worker-2982" 0x00007fadeb895ad3 in ?? ()
88 LWP 29825 "rpc worker-2982" 0x00007fadeb895ad3 in ?? ()
89 LWP 29826 "rpc worker-2982" 0x00007fadeb895ad3 in ?? ()
90 LWP 29827 "rpc worker-2982" 0x00007fadeb895ad3 in ?? ()
91 LWP 29828 "rpc worker-2982" 0x00007fadeb895ad3 in ?? ()
92 LWP 29829 "rpc worker-2982" 0x00007fadeb895ad3 in ?? ()
93 LWP 29830 "rpc worker-2983" 0x00007fadeb895ad3 in ?? ()
94 LWP 29831 "rpc worker-2983" 0x00007fadeb895ad3 in ?? ()
95 LWP 29832 "rpc worker-2983" 0x00007fadeb895ad3 in ?? ()
96 LWP 29833 "rpc worker-2983" 0x00007fadeb895ad3 in ?? ()
97 LWP 29834 "rpc worker-2983" 0x00007fadeb895ad3 in ?? ()
98 LWP 29835 "rpc worker-2983" 0x00007fadeb895ad3 in ?? ()
99 LWP 29836 "rpc worker-2983" 0x00007fadeb895ad3 in ?? ()
100 LWP 29837 "rpc worker-2983" 0x00007fadeb895ad3 in ?? ()
101 LWP 29838 "rpc worker-2983" 0x00007fadeb895ad3 in ?? ()
102 LWP 29839 "rpc worker-2983" 0x00007fadeb895ad3 in ?? ()
103 LWP 29840 "rpc worker-2984" 0x00007fadeb895ad3 in ?? ()
104 LWP 29841 "rpc worker-2984" 0x00007fadeb895ad3 in ?? ()
105 LWP 29842 "rpc worker-2984" 0x00007fadeb895ad3 in ?? ()
106 LWP 29843 "rpc worker-2984" 0x00007fadeb895ad3 in ?? ()
107 LWP 29844 "rpc worker-2984" 0x00007fadeb895ad3 in ?? ()
108 LWP 29845 "rpc worker-2984" 0x00007fadeb895ad3 in ?? ()
109 LWP 29846 "rpc worker-2984" 0x00007fadeb895ad3 in ?? ()
110 LWP 29847 "rpc worker-2984" 0x00007fadeb895ad3 in ?? ()
111 LWP 29848 "rpc worker-2984" 0x00007fadeb895ad3 in ?? ()
112 LWP 29849 "rpc worker-2984" 0x00007fadeb895ad3 in ?? ()
113 LWP 29850 "rpc worker-2985" 0x00007fadeb895ad3 in ?? ()
114 LWP 29851 "rpc worker-2985" 0x00007fadeb895ad3 in ?? ()
115 LWP 29852 "rpc worker-2985" 0x00007fadeb895ad3 in ?? ()
116 LWP 29853 "rpc worker-2985" 0x00007fadeb895ad3 in ?? ()
117 LWP 29854 "diag-logger-298" 0x00007fadeb895fb9 in ?? ()
118 LWP 29855 "result-tracker-" 0x00007fadeb895fb9 in ?? ()
119 LWP 29856 "excess-log-dele" 0x00007fadeb895fb9 in ?? ()
120 LWP 29857 "tcmalloc-memory" 0x00007fadeb895fb9 in ?? ()
121 LWP 29858 "acceptor-29858" 0x00007fade8bfefc7 in ?? ()
122 LWP 29859 "heartbeat-29859" 0x00007fadeb895fb9 in ?? ()
123 LWP 29860 "maintenance_sch" 0x00007fadeb895fb9 in ?? ()
124 LWP 30306 "raft [worker]-3" 0x00007fadeb895fb9 in ?? ()
Thread 124 (LWP 30306):
#0 0x00007fadeb895fb9 in ?? ()
#1 0x00007fada0b2e310 in ?? ()
#2 0x0000000000000029 in ?? ()
#3 0x0000000100000081 in ?? ()
#4 0x00007fada0b2e6a4 in ?? ()
#5 0x00007fada0b2e330 in ?? ()
#6 0x0000000000000053 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x00007fada0b2e350 in ?? ()
#9 0x0000000200000000 in ?? ()
#10 0x0000008000000189 in ?? ()
#11 0x00007fada0b2e3f0 in ?? ()
#12 0x00007fada0b2e3e0 in ?? ()
#13 0x0000000000000000 in ?? ()
Thread 123 (LWP 29860):
#0 0x00007fadeb895fb9 in ?? ()
#1 0x00007fada132f330 in ?? ()
#2 0x0000000000000028 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000055eb0c847e60 in ?? ()
#5 0x00007fada132f350 in ?? ()
#6 0x0000000000000050 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 122 (LWP 29859):
#0 0x00007fadeb895fb9 in ?? ()
#1 0x00007fada1b30310 in ?? ()
#2 0x000000000000000c in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000055eb0c799640 in ?? ()
#5 0x00007fada1b30330 in ?? ()
#6 0x0000000000000018 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 121 (LWP 29858):
#0 0x00007fade8bfefc7 in ?? ()
#1 0x00007fada2331128 in ?? ()
#2 0x00007fada233112c in ?? ()
#3 0x00007fada2331130 in ?? ()
#4 0x0000000000000000 in ?? ()
Thread 120 (LWP 29857):
#0 0x00007fadeb895fb9 in ?? ()
#1 0x00007fada2b324f0 in ?? ()
#2 0x0000000000000001 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x00007ffdbb1956f0 in ?? ()
#5 0x00007fada2b32510 in ?? ()
#6 0x0000000000000002 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 119 (LWP 29856):
#0 0x00007fadeb895fb9 in ?? ()
#1 0x00007fada3333480 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 118 (LWP 29855):
#0 0x00007fadeb895fb9 in ?? ()
#1 0x00007fada3b344e0 in ?? ()
#2 0x000000000000000a in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000055eb0c752830 in ?? ()
#5 0x00007fada3b34500 in ?? ()
#6 0x0000000000000014 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 117 (LWP 29854):
#0 0x00007fadeb895fb9 in ?? ()
#1 0x00007fada4335450 in ?? ()
#2 0x000000000000000b in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000055eb0ca5ebe0 in ?? ()
#5 0x00007fada4335470 in ?? ()
#6 0x0000000000000016 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 116 (LWP 29853):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000007 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x000055eb0ca59d8c in ?? ()
#4 0x00007fada4b36430 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fada4b36450 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x000055eb0ca5ea30 in ?? ()
#9 0x00007fadeb895770 in ?? ()
#10 0x00007fada4b36450 in ?? ()
#11 0x0000000100000000 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 115 (LWP 29852):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x000055eb0ca59bdc in ?? ()
#4 0x00007fada5337430 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fada5337450 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x000055eb0ca5e910 in ?? ()
#9 0x00007fadeb895770 in ?? ()
#10 0x00007fada5337450 in ?? ()
#11 0x0000000100000000 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 114 (LWP 29851):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 113 (LWP 29850):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 112 (LWP 29849):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 111 (LWP 29848):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 110 (LWP 29847):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 109 (LWP 29846):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 108 (LWP 29845):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 107 (LWP 29844):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 106 (LWP 29843):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 105 (LWP 29842):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 104 (LWP 29841):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 103 (LWP 29840):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 102 (LWP 29839):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 101 (LWP 29838):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 100 (LWP 29837):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 99 (LWP 29836):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 98 (LWP 29835):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 97 (LWP 29834):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 96 (LWP 29833):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 95 (LWP 29832):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 94 (LWP 29831):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 93 (LWP 29830):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 92 (LWP 29829):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 91 (LWP 29828):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 90 (LWP 29827):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 89 (LWP 29826):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 88 (LWP 29825):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 87 (LWP 29824):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 86 (LWP 29823):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 85 (LWP 29822):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 84 (LWP 29821):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 83 (LWP 29820):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 82 (LWP 29819):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 81 (LWP 29818):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 80 (LWP 29817):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 79 (LWP 29816):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 78 (LWP 29815):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 77 (LWP 29814):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 76 (LWP 29813):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000002 in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x000055eb0ca24af8 in ?? ()
#4 0x00007fadb8b5e430 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fadb8b5e450 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 75 (LWP 29812):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 74 (LWP 29811):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 73 (LWP 29810):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 72 (LWP 29809):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 71 (LWP 29808):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 70 (LWP 29807):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 69 (LWP 29806):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 68 (LWP 29805):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 67 (LWP 29804):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 66 (LWP 29803):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 65 (LWP 29802):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 64 (LWP 29801):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 63 (LWP 29800):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 62 (LWP 29799):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 61 (LWP 29798):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 60 (LWP 29797):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 59 (LWP 29796):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 58 (LWP 29795):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 57 (LWP 29794):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 56 (LWP 29793):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000002 in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x000055eb0ca0f878 in ?? ()
#4 0x00007fadc2b72430 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fadc2b72450 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 55 (LWP 29792):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 54 (LWP 29791):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 53 (LWP 29790):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 52 (LWP 29789):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 51 (LWP 29788):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 50 (LWP 29787):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 49 (LWP 29786):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 48 (LWP 29785):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 47 (LWP 29784):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 46 (LWP 29783):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 45 (LWP 29782):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 44 (LWP 29781):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 43 (LWP 29780):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 42 (LWP 29779):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 41 (LWP 29778):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 40 (LWP 29777):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 39 (LWP 29776):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 38 (LWP 29775):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 37 (LWP 29774):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 36 (LWP 29773):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x000000000000032e in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x000055eb0ca0c5e8 in ?? ()
#4 0x00007fadccb86430 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fadccb86450 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 35 (LWP 29772):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x000000000000034f in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x000055eb0ca0c43c in ?? ()
#4 0x00007fadcd387430 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fadcd387450 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x000055eb0ca00d90 in ?? ()
#9 0x00007fadeb895770 in ?? ()
#10 0x00007fadcd387450 in ?? ()
#11 0x0000000100000000 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 34 (LWP 29771):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x000000000000032c in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x000055eb0ca0c288 in ?? ()
#4 0x00007fadcdb88430 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fadcdb88450 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 33 (LWP 29770):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000110 in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x000055eb0c85b098 in ?? ()
#4 0x00007fadce389430 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fadce389450 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 32 (LWP 29769):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x00000000000000fa in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x000055eb0ca0c048 in ?? ()
#4 0x00007fadceb8a430 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fadceb8a450 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 31 (LWP 29768):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 30 (LWP 29767):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 29 (LWP 29766):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 28 (LWP 29765):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 27 (LWP 29764):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 26 (LWP 29763):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 25 (LWP 29762):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 24 (LWP 29761):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 23 (LWP 29760):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 22 (LWP 29759):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 21 (LWP 29758):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 20 (LWP 29757):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 19 (LWP 29756):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 18 (LWP 29755):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 17 (LWP 29754):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 16 (LWP 29753):
#0 0x00007fadeb895fb9 in ?? ()
#1 0x00007fadd6b9a390 in ?? ()
#2 0x0000000000000001 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000055eb0c7994a0 in ?? ()
#5 0x00007fadd6b9a3b0 in ?? ()
#6 0x0000000000000002 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 15 (LWP 29752):
#0 0x00007fadeb895fb9 in ?? ()
#1 0x00007fadd739b540 in ?? ()
#2 0x0000000000000002 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000055eb0c7fd098 in ?? ()
#5 0x00007fadd739b560 in ?? ()
#6 0x0000000000000004 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 14 (LWP 29750):
#0 0x00007fadeb895fb9 in ?? ()
#1 0x00007fadd839d0c0 in ?? ()
#2 0x0000000000000001 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000055eb0c78f2a8 in ?? ()
#5 0x00007fadd839d0e0 in ?? ()
#6 0x0000000000000002 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 13 (LWP 29749):
#0 0x00007fadeb895ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 12 (LWP 29748):
#0 0x00007fade8bfd947 in ?? ()
#1 0x00007fadd939f580 in ?? ()
#2 0x00007fadebec85c7 in ?? ()
#3 0x00007fadd939f580 in ?? ()
#4 0x0000000000000000 in ?? ()
Thread 11 (LWP 29747):
#0 0x00007fade8bfd947 in ?? ()
#1 0x00007fadd9ba0580 in ?? ()
#2 0x00007fadebec85c7 in ?? ()
#3 0x00007fadd9ba0580 in ?? ()
#4 0x0000000000000000 in ?? ()
Thread 10 (LWP 29746):
#0 0x00007fade8bfd947 in ?? ()
#1 0x00007fadda3a1580 in ?? ()
#2 0x00007fadebec85c7 in ?? ()
#3 0x00007fadda3a1580 in ?? ()
#4 0x0000000000000000 in ?? ()
Thread 9 (LWP 29745):
#0 0x00007fade8bfd947 in ?? ()
#1 0x00007faddc18c580 in ?? ()
#2 0x00007fadebec85c7 in ?? ()
#3 0x00007faddc18c580 in ?? ()
#4 0x0000000000000000 in ?? ()
Thread 8 (LWP 29742):
#0 0x00007fade8bf0bb9 in ?? ()
#1 0x00007faddd98f800 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 7 (LWP 29741):
#0 0x00007fadeb895fb9 in ?? ()
#1 0x00007faddd18e4e0 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 6 (LWP 29740):
#0 0x00007fadeb8999e2 in ?? ()
#1 0x00007faddc98d330 in ?? ()
#2 0x00007faddc98d380 in ?? ()
#3 0x00007faddc98d350 in ?? ()
#4 0x0000000000000001 in ?? ()
#5 0x00007faddc98d6c0 in ?? ()
#6 0x000055eb0c8267f0 in ?? ()
#7 0x000055eb0c6b0fd0 in ?? ()
#8 0x000055eb0c6b0fc0 in ?? ()
#9 0x00007ffdbb193cc0 in ?? ()
#10 0x00007fadec3cfd72 in ?? ()
#11 0x000055eb0c824b90 in ?? ()
#12 0x000055eb0c7522d0 in ?? ()
#13 0x00007faddc98d390 in ?? ()
#14 0x00007fadead1a040 in ?? ()
#15 0x00007fad00000010 in ?? ()
#16 0x00007fadeae4ef69 in ?? ()
#17 0x00007faddc98d3b0 in ?? ()
#18 0x000055eb0c735dd0 in ?? ()
#19 0x00007fadead1a040 in ?? ()
#20 0x0000000000000000 in ?? ()
Thread 5 (LWP 29734):
#0 0x00007fadeb895fb9 in ?? ()
#1 0x00007fadde9912b0 in ?? ()
#2 0x0000000000000033 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000055eb0c6b12e8 in ?? ()
#5 0x00007fadde9912d0 in ?? ()
#6 0x0000000000000066 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 4 (LWP 29733):
#0 0x00007fadeb895fb9 in ?? ()
#1 0x00007faddf1925c0 in ?? ()
#2 0x0000000000000002 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000055eb0c7fc438 in ?? ()
#5 0x00007faddf1925e0 in ?? ()
#6 0x0000000000000004 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 3 (LWP 29732):
#0 0x00007fadeb895fb9 in ?? ()
#1 0x00007faddf9935c0 in ?? ()
#2 0x0000000000000002 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000055eb0c7fc2f8 in ?? ()
#5 0x00007faddf9935e0 in ?? ()
#6 0x0000000000000004 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 2 (LWP 29731):
#0 0x00007fadeb895fb9 in ?? ()
#1 0x00007fade01945c0 in ?? ()
#2 0x0000000000000002 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000055eb0c7fc1b8 in ?? ()
#5 0x00007fade01945e0 in ?? ()
#6 0x0000000000000004 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 1 (LWP 29730):
#0 0x00007fadeb899d50 in ?? ()
#1 0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20260430 02:02:58.462647 25189 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu with pid 29862
I20260430 02:02:58.470600 25189 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu with pid 29598
I20260430 02:02:58.478353 25189 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu with pid 29997
I20260430 02:02:58.487075 25189 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu with pid 29730
I20260430 02:02:58.497853 25189 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskfXPN2o/build/debug/bin/kudu with pid 26341
2026-04-30T02:02:58Z chronyd exiting
I20260430 02:02:58.518707 25189 test_util.cc:182] -----------------------------------------------
I20260430 02:02:58.518841 25189 test_util.cc:183] Had failures, leaving test files at /tmp/dist-test-taskfXPN2o/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777514513829142-25189-0
[ FAILED ] RollingRestartArgs/RollingRestartITest.TestWorkloads/4, where GetParam() = 800-byte object <01-00 00-00 01-00 00-00 04-00 00-00 00-00 00-00 20-80 FC-5D A5-55 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 40-80 FC-5D A5-55 00-00 00-00 00-00 00-00 00-00 ... 00-00 00-00 00-00 00-00 F8-82 FC-5D A5-55 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-01 00-00 00-00 00-00 04-00 00-00 03-00 00-00 01-00 00-00 00-00 00-00> (45093 ms)
[----------] 1 test from RollingRestartArgs/RollingRestartITest (45093 ms total)
[----------] Global test environment tear-down
[==========] 2 tests from 2 test suites ran. (64679 ms total)
[ PASSED ] 1 test.
[ FAILED ] 1 test, listed below:
[ FAILED ] RollingRestartArgs/RollingRestartITest.TestWorkloads/4, where GetParam() = 800-byte object <01-00 00-00 01-00 00-00 04-00 00-00 00-00 00-00 20-80 FC-5D A5-55 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 40-80 FC-5D A5-55 00-00 00-00 00-00 00-00 00-00 ... 00-00 00-00 00-00 00-00 F8-82 FC-5D A5-55 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-01 00-00 00-00 00-00 04-00 00-00 03-00 00-00 01-00 00-00 00-00 00-00>
1 FAILED TEST
I20260430 02:02:58.519452 25189 logging.cc:424] LogThrottler /home/jenkins-slave/workspace/build_and_test_flaky/src/kudu/client/meta_cache.cc:302: suppressed but not reported on 28 messages since previous log ~10 seconds ago