Note: This is test shard 1 of 8.
[==========] Running 2 tests from 2 test suites.
[----------] Global test environment set-up.
[----------] 1 test from MaintenanceModeRF3ITest
[ RUN      ] MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate
2025-10-28T09:09:59Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2025-10-28T09:09:59Z Disabled control of system clock
WARNING: Logging before InitGoogleLogging() is written to STDERR
I20251028 09:09:59.059334  8282 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6m1lU8/build/release/bin/kudu
/tmp/dist-test-task6m1lU8/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.8.22.190:38971
--webserver_interface=127.8.22.190
--webserver_port=0
--builtin_ntp_servers=127.8.22.148:34983
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.8.22.190:38971 with env {}
W20251028 09:09:59.137524  8290 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251028 09:09:59.137701  8290 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251028 09:09:59.137719  8290 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251028 09:09:59.139171  8290 flags.cc:425] Enabled experimental flag: --ipki_ca_key_size=768
W20251028 09:09:59.139218  8290 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251028 09:09:59.139232  8290 flags.cc:425] Enabled experimental flag: --tsk_num_rsa_bits=512
W20251028 09:09:59.139245  8290 flags.cc:425] Enabled experimental flag: --rpc_reuseport=true
I20251028 09:09:59.140820  8290 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.8.22.148:34983
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.8.22.190:38971
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.8.22.190:38971
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.8.22.190
--webserver_port=0
--never_fsync=true
--heap_profile_path=/tmp/kudu.8290
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true
Master server version:
kudu 1.19.0-SNAPSHOT
revision 60f5e5267b92c39485a66121d3ce3cc7ef57b0e0
build type RELEASE
built by None at 28 Oct 2025 08:43:13 UTC on 5fd53c4cbb9d
build id 8755
I20251028 09:09:59.141014  8290 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251028 09:09:59.141213  8290 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251028 09:09:59.144124  8296 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:09:59.144237  8290 server_base.cc:1047] running on GCE node
W20251028 09:09:59.144084  8295 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:09:59.144095  8298 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:09:59.144610  8290 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251028 09:09:59.144830  8290 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251028 09:09:59.145970  8290 hybrid_clock.cc:648] HybridClock initialized: now 1761642599145951 us; error 38 us; skew 500 ppm
I20251028 09:09:59.147253  8290 webserver.cc:492] Webserver started at http://127.8.22.190:35375/ using document root <none> and password file <none>
I20251028 09:09:59.147450  8290 fs_manager.cc:362] Metadata directory not provided
I20251028 09:09:59.147497  8290 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251028 09:09:59.147621  8290 server_base.cc:895] This appears to be a new deployment of Kudu; creating new FS layout
I20251028 09:09:59.148514  8290 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/master-0/data/instance:
uuid: "f523d2f8a0724bf98f9f9645cb64b032"
format_stamp: "Formatted at 2025-10-28 09:09:59 on dist-test-slave-kqwd"
I20251028 09:09:59.148814  8290 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/master-0/wal/instance:
uuid: "f523d2f8a0724bf98f9f9645cb64b032"
format_stamp: "Formatted at 2025-10-28 09:09:59 on dist-test-slave-kqwd"
I20251028 09:09:59.149988  8290 fs_manager.cc:696] Time spent creating directory manager: real 0.001s	user 0.000s	sys 0.003s
I20251028 09:09:59.150722  8304 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:09:59.150887  8290 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.000s	sys 0.001s
I20251028 09:09:59.150978  8290 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/master-0/data,/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/master-0/wal
uuid: "f523d2f8a0724bf98f9f9645cb64b032"
format_stamp: "Formatted at 2025-10-28 09:09:59 on dist-test-slave-kqwd"
I20251028 09:09:59.151041  8290 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251028 09:09:59.164439  8290 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251028 09:09:59.164741  8290 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251028 09:09:59.164863  8290 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251028 09:09:59.168607  8290 rpc_server.cc:307] RPC server started. Bound to: 127.8.22.190:38971
I20251028 09:09:59.168638  8356 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.8.22.190:38971 every 8 connection(s)
I20251028 09:09:59.168968  8290 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/master-0/data/info.pb
I20251028 09:09:59.169566  8357 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20251028 09:09:59.171985  8357 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032: Bootstrap starting.
I20251028 09:09:59.172574  8357 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032: Neither blocks nor log segments found. Creating new log.
I20251028 09:09:59.172819  8357 log.cc:826] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032: Log is configured to *not* fsync() on all Append() calls
I20251028 09:09:59.173269  8282 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6m1lU8/build/release/bin/kudu as pid 8290
I20251028 09:09:59.173367  8282 external_mini_cluster.cc:1442] Reading /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/master-0/wal/instance
I20251028 09:09:59.173458  8357 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032: No bootstrap required, opened a new log
I20251028 09:09:59.175197  8357 raft_consensus.cc:359] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "f523d2f8a0724bf98f9f9645cb64b032" member_type: VOTER last_known_addr { host: "127.8.22.190" port: 38971 } }
I20251028 09:09:59.175372  8357 raft_consensus.cc:385] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251028 09:09:59.175405  8357 raft_consensus.cc:740] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: f523d2f8a0724bf98f9f9645cb64b032, State: Initialized, Role: FOLLOWER
I20251028 09:09:59.175508  8357 consensus_queue.cc:260] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "f523d2f8a0724bf98f9f9645cb64b032" member_type: VOTER last_known_addr { host: "127.8.22.190" port: 38971 } }
I20251028 09:09:59.175580  8357 raft_consensus.cc:399] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032 [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20251028 09:09:59.175616  8357 raft_consensus.cc:493] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032 [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20251028 09:09:59.175660  8357 raft_consensus.cc:3060] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032 [term 0 FOLLOWER]: Advancing to term 1
I20251028 09:09:59.176352  8357 raft_consensus.cc:515] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "f523d2f8a0724bf98f9f9645cb64b032" member_type: VOTER last_known_addr { host: "127.8.22.190" port: 38971 } }
I20251028 09:09:59.176471  8357 leader_election.cc:304] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: f523d2f8a0724bf98f9f9645cb64b032; no voters: 
I20251028 09:09:59.176631  8357 leader_election.cc:290] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032 [CANDIDATE]: Term 1 election: Requested vote from peers 
I20251028 09:09:59.176687  8362 raft_consensus.cc:2804] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032 [term 1 FOLLOWER]: Leader election won for term 1
I20251028 09:09:59.176887  8362 raft_consensus.cc:697] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032 [term 1 LEADER]: Becoming Leader. State: Replica: f523d2f8a0724bf98f9f9645cb64b032, State: Running, Role: LEADER
I20251028 09:09:59.176976  8357 sys_catalog.cc:565] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032 [sys.catalog]: configured and running, proceeding with master startup.
I20251028 09:09:59.177038  8362 consensus_queue.cc:237] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "f523d2f8a0724bf98f9f9645cb64b032" member_type: VOTER last_known_addr { host: "127.8.22.190" port: 38971 } }
I20251028 09:09:59.177470  8364 sys_catalog.cc:455] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "f523d2f8a0724bf98f9f9645cb64b032" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "f523d2f8a0724bf98f9f9645cb64b032" member_type: VOTER last_known_addr { host: "127.8.22.190" port: 38971 } } }
I20251028 09:09:59.177539  8364 sys_catalog.cc:458] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032 [sys.catalog]: This master's current role is: LEADER
I20251028 09:09:59.177758  8363 sys_catalog.cc:455] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032 [sys.catalog]: SysCatalogTable state changed. Reason: New leader f523d2f8a0724bf98f9f9645cb64b032. Latest consensus state: current_term: 1 leader_uuid: "f523d2f8a0724bf98f9f9645cb64b032" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "f523d2f8a0724bf98f9f9645cb64b032" member_type: VOTER last_known_addr { host: "127.8.22.190" port: 38971 } } }
I20251028 09:09:59.177874  8363 sys_catalog.cc:458] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032 [sys.catalog]: This master's current role is: LEADER
I20251028 09:09:59.178198  8368 catalog_manager.cc:1485] Loading table and tablet metadata into memory...
I20251028 09:09:59.178896  8368 catalog_manager.cc:1494] Initializing Kudu cluster ID...
I20251028 09:09:59.180449  8368 catalog_manager.cc:1357] Generated new cluster ID: c77fe8589d5c4c16b651a4133596eec9
I20251028 09:09:59.180505  8368 catalog_manager.cc:1505] Initializing Kudu internal certificate authority...
I20251028 09:09:59.197155  8368 catalog_manager.cc:1380] Generated new certificate authority record
I20251028 09:09:59.197697  8368 catalog_manager.cc:1514] Loading token signing keys...
I20251028 09:09:59.205463  8368 catalog_manager.cc:6022] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032: Generated new TSK 0
I20251028 09:09:59.205658  8368 catalog_manager.cc:1524] Initializing in-progress tserver states...
I20251028 09:09:59.209678  8282 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6m1lU8/build/release/bin/kudu
/tmp/dist-test-task6m1lU8/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.8.22.129:0
--local_ip_for_outbound_sockets=127.8.22.129
--webserver_interface=127.8.22.129
--webserver_port=0
--tserver_master_addrs=127.8.22.190:38971
--builtin_ntp_servers=127.8.22.148:34983
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--follower_unavailable_considered_failed_sec=2
--enable_log_gc=false with env {}
W20251028 09:09:59.297863  8381 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251028 09:09:59.298059  8381 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251028 09:09:59.298087  8381 flags.cc:425] Enabled unsafe flag: --enable_log_gc=false
W20251028 09:09:59.298107  8381 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251028 09:09:59.299633  8381 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251028 09:09:59.299703  8381 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.8.22.129
I20251028 09:09:59.301262  8381 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.8.22.148:34983
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--follower_unavailable_considered_failed_sec=2
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.8.22.129:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.8.22.129
--webserver_port=0
--enable_log_gc=false
--tserver_master_addrs=127.8.22.190:38971
--never_fsync=true
--heap_profile_path=/tmp/kudu.8381
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.8.22.129
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 60f5e5267b92c39485a66121d3ce3cc7ef57b0e0
build type RELEASE
built by None at 28 Oct 2025 08:43:13 UTC on 5fd53c4cbb9d
build id 8755
I20251028 09:09:59.301527  8381 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251028 09:09:59.301764  8381 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251028 09:09:59.304322  8387 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:09:59.304425  8386 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:09:59.304564  8389 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:09:59.304862  8381 server_base.cc:1047] running on GCE node
I20251028 09:09:59.305056  8381 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251028 09:09:59.305259  8381 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251028 09:09:59.306403  8381 hybrid_clock.cc:648] HybridClock initialized: now 1761642599306387 us; error 36 us; skew 500 ppm
I20251028 09:09:59.307597  8381 webserver.cc:492] Webserver started at http://127.8.22.129:38231/ using document root <none> and password file <none>
I20251028 09:09:59.307808  8381 fs_manager.cc:362] Metadata directory not provided
I20251028 09:09:59.307847  8381 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251028 09:09:59.307974  8381 server_base.cc:895] This appears to be a new deployment of Kudu; creating new FS layout
I20251028 09:09:59.308838  8381 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-0/data/instance:
uuid: "3d8ce816521247e18c0d5b9e130edc21"
format_stamp: "Formatted at 2025-10-28 09:09:59 on dist-test-slave-kqwd"
I20251028 09:09:59.309193  8381 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-0/wal/instance:
uuid: "3d8ce816521247e18c0d5b9e130edc21"
format_stamp: "Formatted at 2025-10-28 09:09:59 on dist-test-slave-kqwd"
I20251028 09:09:59.310405  8381 fs_manager.cc:696] Time spent creating directory manager: real 0.001s	user 0.001s	sys 0.002s
I20251028 09:09:59.311209  8395 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:09:59.311401  8381 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20251028 09:09:59.311499  8381 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-0/data,/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-0/wal
uuid: "3d8ce816521247e18c0d5b9e130edc21"
format_stamp: "Formatted at 2025-10-28 09:09:59 on dist-test-slave-kqwd"
I20251028 09:09:59.311578  8381 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251028 09:09:59.333067  8381 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251028 09:09:59.333389  8381 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251028 09:09:59.333523  8381 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251028 09:09:59.333789  8381 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251028 09:09:59.334174  8381 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20251028 09:09:59.334224  8381 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:09:59.334275  8381 ts_tablet_manager.cc:616] Registered 0 tablets
I20251028 09:09:59.334306  8381 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:09:59.341269  8381 rpc_server.cc:307] RPC server started. Bound to: 127.8.22.129:41547
I20251028 09:09:59.341327  8508 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.8.22.129:41547 every 8 connection(s)
I20251028 09:09:59.341652  8381 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-0/data/info.pb
I20251028 09:09:59.344792  8282 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6m1lU8/build/release/bin/kudu as pid 8381
I20251028 09:09:59.344870  8282 external_mini_cluster.cc:1442] Reading /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-0/wal/instance
I20251028 09:09:59.346136  8282 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6m1lU8/build/release/bin/kudu
/tmp/dist-test-task6m1lU8/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.8.22.130:0
--local_ip_for_outbound_sockets=127.8.22.130
--webserver_interface=127.8.22.130
--webserver_port=0
--tserver_master_addrs=127.8.22.190:38971
--builtin_ntp_servers=127.8.22.148:34983
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--follower_unavailable_considered_failed_sec=2
--enable_log_gc=false with env {}
I20251028 09:09:59.347016  8509 heartbeater.cc:344] Connected to a master server at 127.8.22.190:38971
I20251028 09:09:59.347131  8509 heartbeater.cc:461] Registering TS with master...
I20251028 09:09:59.347332  8509 heartbeater.cc:507] Master 127.8.22.190:38971 requested a full tablet report, sending...
I20251028 09:09:59.347975  8321 ts_manager.cc:194] Registered new tserver with Master: 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547)
I20251028 09:09:59.348961  8321 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.8.22.129:43559
W20251028 09:09:59.425881  8512 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251028 09:09:59.426064  8512 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251028 09:09:59.426095  8512 flags.cc:425] Enabled unsafe flag: --enable_log_gc=false
W20251028 09:09:59.426115  8512 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251028 09:09:59.427554  8512 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251028 09:09:59.427618  8512 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.8.22.130
I20251028 09:09:59.429142  8512 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.8.22.148:34983
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--follower_unavailable_considered_failed_sec=2
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.8.22.130:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.8.22.130
--webserver_port=0
--enable_log_gc=false
--tserver_master_addrs=127.8.22.190:38971
--never_fsync=true
--heap_profile_path=/tmp/kudu.8512
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.8.22.130
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 60f5e5267b92c39485a66121d3ce3cc7ef57b0e0
build type RELEASE
built by None at 28 Oct 2025 08:43:13 UTC on 5fd53c4cbb9d
build id 8755
I20251028 09:09:59.429386  8512 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251028 09:09:59.429612  8512 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251028 09:09:59.432145  8520 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:09:59.432158  8518 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:09:59.432093  8517 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:09:59.432559  8512 server_base.cc:1047] running on GCE node
I20251028 09:09:59.432739  8512 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251028 09:09:59.432955  8512 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251028 09:09:59.434113  8512 hybrid_clock.cc:648] HybridClock initialized: now 1761642599434103 us; error 38 us; skew 500 ppm
I20251028 09:09:59.435300  8512 webserver.cc:492] Webserver started at http://127.8.22.130:44555/ using document root <none> and password file <none>
I20251028 09:09:59.435508  8512 fs_manager.cc:362] Metadata directory not provided
I20251028 09:09:59.435551  8512 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251028 09:09:59.435669  8512 server_base.cc:895] This appears to be a new deployment of Kudu; creating new FS layout
I20251028 09:09:59.436491  8512 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-1/data/instance:
uuid: "5cf59e115bd641f8965613ab616c9b85"
format_stamp: "Formatted at 2025-10-28 09:09:59 on dist-test-slave-kqwd"
I20251028 09:09:59.436796  8512 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-1/wal/instance:
uuid: "5cf59e115bd641f8965613ab616c9b85"
format_stamp: "Formatted at 2025-10-28 09:09:59 on dist-test-slave-kqwd"
I20251028 09:09:59.438023  8512 fs_manager.cc:696] Time spent creating directory manager: real 0.001s	user 0.002s	sys 0.000s
I20251028 09:09:59.438782  8526 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:09:59.439024  8512 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20251028 09:09:59.439101  8512 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-1/data,/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-1/wal
uuid: "5cf59e115bd641f8965613ab616c9b85"
format_stamp: "Formatted at 2025-10-28 09:09:59 on dist-test-slave-kqwd"
I20251028 09:09:59.439164  8512 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251028 09:09:59.449283  8512 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251028 09:09:59.449542  8512 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251028 09:09:59.449656  8512 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251028 09:09:59.449847  8512 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251028 09:09:59.450171  8512 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20251028 09:09:59.450206  8512 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:09:59.450243  8512 ts_tablet_manager.cc:616] Registered 0 tablets
I20251028 09:09:59.450274  8512 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:09:59.456061  8512 rpc_server.cc:307] RPC server started. Bound to: 127.8.22.130:46821
I20251028 09:09:59.456116  8639 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.8.22.130:46821 every 8 connection(s)
I20251028 09:09:59.456410  8512 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-1/data/info.pb
I20251028 09:09:59.460667  8282 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6m1lU8/build/release/bin/kudu as pid 8512
I20251028 09:09:59.460748  8282 external_mini_cluster.cc:1442] Reading /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-1/wal/instance
I20251028 09:09:59.460707  8640 heartbeater.cc:344] Connected to a master server at 127.8.22.190:38971
I20251028 09:09:59.460808  8640 heartbeater.cc:461] Registering TS with master...
I20251028 09:09:59.460999  8640 heartbeater.cc:507] Master 127.8.22.190:38971 requested a full tablet report, sending...
I20251028 09:09:59.461526  8321 ts_manager.cc:194] Registered new tserver with Master: 5cf59e115bd641f8965613ab616c9b85 (127.8.22.130:46821)
I20251028 09:09:59.461867  8282 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6m1lU8/build/release/bin/kudu
/tmp/dist-test-task6m1lU8/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.8.22.131:0
--local_ip_for_outbound_sockets=127.8.22.131
--webserver_interface=127.8.22.131
--webserver_port=0
--tserver_master_addrs=127.8.22.190:38971
--builtin_ntp_servers=127.8.22.148:34983
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--follower_unavailable_considered_failed_sec=2
--enable_log_gc=false with env {}
I20251028 09:09:59.462040  8321 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.8.22.130:40299
W20251028 09:09:59.543531  8643 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251028 09:09:59.543697  8643 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251028 09:09:59.543717  8643 flags.cc:425] Enabled unsafe flag: --enable_log_gc=false
W20251028 09:09:59.543731  8643 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251028 09:09:59.545131  8643 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251028 09:09:59.545184  8643 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.8.22.131
I20251028 09:09:59.546639  8643 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.8.22.148:34983
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--follower_unavailable_considered_failed_sec=2
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.8.22.131:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.8.22.131
--webserver_port=0
--enable_log_gc=false
--tserver_master_addrs=127.8.22.190:38971
--never_fsync=true
--heap_profile_path=/tmp/kudu.8643
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.8.22.131
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 60f5e5267b92c39485a66121d3ce3cc7ef57b0e0
build type RELEASE
built by None at 28 Oct 2025 08:43:13 UTC on 5fd53c4cbb9d
build id 8755
I20251028 09:09:59.546826  8643 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251028 09:09:59.547096  8643 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251028 09:09:59.549793  8648 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:09:59.549767  8649 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:09:59.550004  8643 server_base.cc:1047] running on GCE node
W20251028 09:09:59.549784  8651 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:09:59.550231  8643 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251028 09:09:59.550453  8643 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251028 09:09:59.551585  8643 hybrid_clock.cc:648] HybridClock initialized: now 1761642599551563 us; error 36 us; skew 500 ppm
I20251028 09:09:59.552807  8643 webserver.cc:492] Webserver started at http://127.8.22.131:45915/ using document root <none> and password file <none>
I20251028 09:09:59.553025  8643 fs_manager.cc:362] Metadata directory not provided
I20251028 09:09:59.553069  8643 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251028 09:09:59.553184  8643 server_base.cc:895] This appears to be a new deployment of Kudu; creating new FS layout
I20251028 09:09:59.554052  8643 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-2/data/instance:
uuid: "a73a6d78104342d7bfeb363acfac482b"
format_stamp: "Formatted at 2025-10-28 09:09:59 on dist-test-slave-kqwd"
I20251028 09:09:59.554384  8643 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-2/wal/instance:
uuid: "a73a6d78104342d7bfeb363acfac482b"
format_stamp: "Formatted at 2025-10-28 09:09:59 on dist-test-slave-kqwd"
I20251028 09:09:59.555676  8643 fs_manager.cc:696] Time spent creating directory manager: real 0.001s	user 0.000s	sys 0.003s
I20251028 09:09:59.556496  8657 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:09:59.556679  8643 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.000s	sys 0.001s
I20251028 09:09:59.556738  8643 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-2/data,/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-2/wal
uuid: "a73a6d78104342d7bfeb363acfac482b"
format_stamp: "Formatted at 2025-10-28 09:09:59 on dist-test-slave-kqwd"
I20251028 09:09:59.556804  8643 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251028 09:09:59.572232  8643 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251028 09:09:59.572495  8643 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251028 09:09:59.572621  8643 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251028 09:09:59.572824  8643 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251028 09:09:59.573133  8643 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20251028 09:09:59.573164  8643 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:09:59.573213  8643 ts_tablet_manager.cc:616] Registered 0 tablets
I20251028 09:09:59.573244  8643 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:09:59.579346  8643 rpc_server.cc:307] RPC server started. Bound to: 127.8.22.131:38055
I20251028 09:09:59.579396  8770 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.8.22.131:38055 every 8 connection(s)
I20251028 09:09:59.579694  8643 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-2/data/info.pb
I20251028 09:09:59.584169  8771 heartbeater.cc:344] Connected to a master server at 127.8.22.190:38971
I20251028 09:09:59.584275  8771 heartbeater.cc:461] Registering TS with master...
I20251028 09:09:59.584486  8771 heartbeater.cc:507] Master 127.8.22.190:38971 requested a full tablet report, sending...
I20251028 09:09:59.584848  8321 ts_manager.cc:194] Registered new tserver with Master: a73a6d78104342d7bfeb363acfac482b (127.8.22.131:38055)
I20251028 09:09:59.585227  8321 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.8.22.131:57747
I20251028 09:09:59.586226  8282 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6m1lU8/build/release/bin/kudu as pid 8643
I20251028 09:09:59.586303  8282 external_mini_cluster.cc:1442] Reading /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-2/wal/instance
I20251028 09:09:59.587332  8282 external_mini_cluster.cc:949] 3 TS(s) registered with all masters
I20251028 09:09:59.593091  8282 test_util.cc:276] Using random seed: 1633596543
I20251028 09:09:59.599371  8321 catalog_manager.cc:2257] Servicing CreateTable request from {username='slave'} at 127.0.0.1:33222:
name: "test-workload"
schema {
  columns {
    name: "key"
    type: INT32
    is_key: true
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "int_val"
    type: INT32
    is_key: false
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "string_val"
    type: STRING
    is_key: false
    is_nullable: true
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
}
num_replicas: 3
split_rows_range_bounds {
  rows: "<redacted>""\004\001\000UUU\025\004\001\000\252\252\252*\004\001\000\377\377\377?\004\001\000TUUU\004\001\000\251\252\252j"
  indirect_data: "<redacted>"""
}
partition_schema {
  range_schema {
    columns {
      name: "key"
    }
  }
}
W20251028 09:09:59.599704  8321 catalog_manager.cc:7011] The number of live tablet servers is not enough to re-replicate a tablet replica of the newly created table test-workload in case of a server failure: 4 tablet servers would be needed, 3 are available. Consider bringing up more tablet servers.
I20251028 09:09:59.606801  8701 tablet_service.cc:1505] Processing CreateTablet for tablet 386ef2b749f24de9aac6947b29d3479c (DEFAULT_TABLE table=test-workload [id=b87c889f1b21497693f06a41b4761586]), partition=RANGE (key) PARTITION 1431655764 <= VALUES < 1789569705
I20251028 09:09:59.606822  8440 tablet_service.cc:1505] Processing CreateTablet for tablet 61b94d4520cd4af999e8e38b3a475ac4 (DEFAULT_TABLE table=test-workload [id=b87c889f1b21497693f06a41b4761586]), partition=RANGE (key) PARTITION 1073741823 <= VALUES < 1431655764
I20251028 09:09:59.607071  8438 tablet_service.cc:1505] Processing CreateTablet for tablet 189fdd871f304eb888387f1f15973390 (DEFAULT_TABLE table=test-workload [id=b87c889f1b21497693f06a41b4761586]), partition=RANGE (key) PARTITION 1789569705 <= VALUES
I20251028 09:09:59.607062  8701 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 386ef2b749f24de9aac6947b29d3479c. 1 dirs total, 0 dirs full, 0 dirs failed
I20251028 09:09:59.607204  8440 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 61b94d4520cd4af999e8e38b3a475ac4. 1 dirs total, 0 dirs full, 0 dirs failed
I20251028 09:09:59.606876  8441 tablet_service.cc:1505] Processing CreateTablet for tablet 9e00a8dbf00b4468bbbfc193e444299e (DEFAULT_TABLE table=test-workload [id=b87c889f1b21497693f06a41b4761586]), partition=RANGE (key) PARTITION 715827882 <= VALUES < 1073741823
I20251028 09:09:59.609059  8441 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 9e00a8dbf00b4468bbbfc193e444299e. 1 dirs total, 0 dirs full, 0 dirs failed
I20251028 09:09:59.609109  8790 tablet_bootstrap.cc:492] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b: Bootstrap starting.
I20251028 09:09:59.609179  8698 tablet_service.cc:1505] Processing CreateTablet for tablet 189fdd871f304eb888387f1f15973390 (DEFAULT_TABLE table=test-workload [id=b87c889f1b21497693f06a41b4761586]), partition=RANGE (key) PARTITION 1789569705 <= VALUES
I20251028 09:09:59.609269  8698 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 189fdd871f304eb888387f1f15973390. 1 dirs total, 0 dirs full, 0 dirs failed
I20251028 09:09:59.609818  8790 tablet_bootstrap.cc:654] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b: Neither blocks nor log segments found. Creating new log.
I20251028 09:09:59.610118  8790 log.cc:826] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b: Log is configured to *not* fsync() on all Append() calls
I20251028 09:09:59.606995  8439 tablet_service.cc:1505] Processing CreateTablet for tablet 386ef2b749f24de9aac6947b29d3479c (DEFAULT_TABLE table=test-workload [id=b87c889f1b21497693f06a41b4761586]), partition=RANGE (key) PARTITION 1431655764 <= VALUES < 1789569705
I20251028 09:09:59.610335  8439 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 386ef2b749f24de9aac6947b29d3479c. 1 dirs total, 0 dirs full, 0 dirs failed
I20251028 09:09:59.610801  8790 tablet_bootstrap.cc:492] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b: No bootstrap required, opened a new log
I20251028 09:09:59.610868  8790 ts_tablet_manager.cc:1403] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b: Time spent bootstrapping tablet: real 0.002s	user 0.002s	sys 0.000s
I20251028 09:09:59.611032  8572 tablet_service.cc:1505] Processing CreateTablet for tablet 9e00a8dbf00b4468bbbfc193e444299e (DEFAULT_TABLE table=test-workload [id=b87c889f1b21497693f06a41b4761586]), partition=RANGE (key) PARTITION 715827882 <= VALUES < 1073741823
I20251028 09:09:59.611282  8570 tablet_service.cc:1505] Processing CreateTablet for tablet 386ef2b749f24de9aac6947b29d3479c (DEFAULT_TABLE table=test-workload [id=b87c889f1b21497693f06a41b4761586]), partition=RANGE (key) PARTITION 1431655764 <= VALUES < 1789569705
I20251028 09:09:59.611372  8572 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 9e00a8dbf00b4468bbbfc193e444299e. 1 dirs total, 0 dirs full, 0 dirs failed
I20251028 09:09:59.611428  8569 tablet_service.cc:1505] Processing CreateTablet for tablet 189fdd871f304eb888387f1f15973390 (DEFAULT_TABLE table=test-workload [id=b87c889f1b21497693f06a41b4761586]), partition=RANGE (key) PARTITION 1789569705 <= VALUES
I20251028 09:09:59.611492  8569 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 189fdd871f304eb888387f1f15973390. 1 dirs total, 0 dirs full, 0 dirs failed
I20251028 09:09:59.611032  8571 tablet_service.cc:1505] Processing CreateTablet for tablet 61b94d4520cd4af999e8e38b3a475ac4 (DEFAULT_TABLE table=test-workload [id=b87c889f1b21497693f06a41b4761586]), partition=RANGE (key) PARTITION 1073741823 <= VALUES < 1431655764
I20251028 09:09:59.611616  8573 tablet_service.cc:1505] Processing CreateTablet for tablet bbe94a484b0c4198bb904ffa7d3ef621 (DEFAULT_TABLE table=test-workload [id=b87c889f1b21497693f06a41b4761586]), partition=RANGE (key) PARTITION 357913941 <= VALUES < 715827882
I20251028 09:09:59.611691  8571 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 61b94d4520cd4af999e8e38b3a475ac4. 1 dirs total, 0 dirs full, 0 dirs failed
I20251028 09:09:59.611049  8574 tablet_service.cc:1505] Processing CreateTablet for tablet a208b359e1a44c018a8ad71c60ff8fed (DEFAULT_TABLE table=test-workload [id=b87c889f1b21497693f06a41b4761586]), partition=RANGE (key) PARTITION VALUES < 357913941
I20251028 09:09:59.611790  8438 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 189fdd871f304eb888387f1f15973390. 1 dirs total, 0 dirs full, 0 dirs failed
I20251028 09:09:59.611833  8574 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet a208b359e1a44c018a8ad71c60ff8fed. 1 dirs total, 0 dirs full, 0 dirs failed
I20251028 09:09:59.612192  8703 tablet_service.cc:1505] Processing CreateTablet for tablet 9e00a8dbf00b4468bbbfc193e444299e (DEFAULT_TABLE table=test-workload [id=b87c889f1b21497693f06a41b4761586]), partition=RANGE (key) PARTITION 715827882 <= VALUES < 1073741823
I20251028 09:09:59.612561  8703 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 9e00a8dbf00b4468bbbfc193e444299e. 1 dirs total, 0 dirs full, 0 dirs failed
I20251028 09:09:59.612614  8790 raft_consensus.cc:359] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.612756  8790 raft_consensus.cc:385] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251028 09:09:59.612787  8790 raft_consensus.cc:740] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: a73a6d78104342d7bfeb363acfac482b, State: Initialized, Role: FOLLOWER
I20251028 09:09:59.606827  8442 tablet_service.cc:1505] Processing CreateTablet for tablet bbe94a484b0c4198bb904ffa7d3ef621 (DEFAULT_TABLE table=test-workload [id=b87c889f1b21497693f06a41b4761586]), partition=RANGE (key) PARTITION 357913941 <= VALUES < 715827882
I20251028 09:09:59.612881  8790 consensus_queue.cc:260] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.612977  8442 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet bbe94a484b0c4198bb904ffa7d3ef621. 1 dirs total, 0 dirs full, 0 dirs failed
I20251028 09:09:59.613159  8790 ts_tablet_manager.cc:1434] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b: Time spent starting tablet: real 0.002s	user 0.002s	sys 0.000s
I20251028 09:09:59.613291  8790 tablet_bootstrap.cc:492] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b: Bootstrap starting.
I20251028 09:09:59.613714  8790 tablet_bootstrap.cc:654] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b: Neither blocks nor log segments found. Creating new log.
I20251028 09:09:59.613760  8771 heartbeater.cc:499] Master 127.8.22.190:38971 was elected leader, sending a full tablet report...
I20251028 09:09:59.613973  8570 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 386ef2b749f24de9aac6947b29d3479c. 1 dirs total, 0 dirs full, 0 dirs failed
I20251028 09:09:59.614259  8573 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet bbe94a484b0c4198bb904ffa7d3ef621. 1 dirs total, 0 dirs full, 0 dirs failed
I20251028 09:09:59.615098  8443 tablet_service.cc:1505] Processing CreateTablet for tablet a208b359e1a44c018a8ad71c60ff8fed (DEFAULT_TABLE table=test-workload [id=b87c889f1b21497693f06a41b4761586]), partition=RANGE (key) PARTITION VALUES < 357913941
I20251028 09:09:59.615197  8443 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet a208b359e1a44c018a8ad71c60ff8fed. 1 dirs total, 0 dirs full, 0 dirs failed
I20251028 09:09:59.606613  8702 tablet_service.cc:1505] Processing CreateTablet for tablet 61b94d4520cd4af999e8e38b3a475ac4 (DEFAULT_TABLE table=test-workload [id=b87c889f1b21497693f06a41b4761586]), partition=RANGE (key) PARTITION 1073741823 <= VALUES < 1431655764
I20251028 09:09:59.615778  8702 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 61b94d4520cd4af999e8e38b3a475ac4. 1 dirs total, 0 dirs full, 0 dirs failed
I20251028 09:09:59.616618  8791 tablet_bootstrap.cc:492] T 61b94d4520cd4af999e8e38b3a475ac4 P 3d8ce816521247e18c0d5b9e130edc21: Bootstrap starting.
I20251028 09:09:59.606635  8704 tablet_service.cc:1505] Processing CreateTablet for tablet bbe94a484b0c4198bb904ffa7d3ef621 (DEFAULT_TABLE table=test-workload [id=b87c889f1b21497693f06a41b4761586]), partition=RANGE (key) PARTITION 357913941 <= VALUES < 715827882
I20251028 09:09:59.617062  8704 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet bbe94a484b0c4198bb904ffa7d3ef621. 1 dirs total, 0 dirs full, 0 dirs failed
I20251028 09:09:59.617417  8791 tablet_bootstrap.cc:654] T 61b94d4520cd4af999e8e38b3a475ac4 P 3d8ce816521247e18c0d5b9e130edc21: Neither blocks nor log segments found. Creating new log.
I20251028 09:09:59.617668  8791 log.cc:826] T 61b94d4520cd4af999e8e38b3a475ac4 P 3d8ce816521247e18c0d5b9e130edc21: Log is configured to *not* fsync() on all Append() calls
I20251028 09:09:59.606613  8705 tablet_service.cc:1505] Processing CreateTablet for tablet a208b359e1a44c018a8ad71c60ff8fed (DEFAULT_TABLE table=test-workload [id=b87c889f1b21497693f06a41b4761586]), partition=RANGE (key) PARTITION VALUES < 357913941
I20251028 09:09:59.617915  8705 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet a208b359e1a44c018a8ad71c60ff8fed. 1 dirs total, 0 dirs full, 0 dirs failed
I20251028 09:09:59.618273  8791 tablet_bootstrap.cc:492] T 61b94d4520cd4af999e8e38b3a475ac4 P 3d8ce816521247e18c0d5b9e130edc21: No bootstrap required, opened a new log
I20251028 09:09:59.618317  8791 ts_tablet_manager.cc:1403] T 61b94d4520cd4af999e8e38b3a475ac4 P 3d8ce816521247e18c0d5b9e130edc21: Time spent bootstrapping tablet: real 0.002s	user 0.001s	sys 0.000s
I20251028 09:09:59.619978  8791 raft_consensus.cc:359] T 61b94d4520cd4af999e8e38b3a475ac4 P 3d8ce816521247e18c0d5b9e130edc21 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.620138  8791 raft_consensus.cc:385] T 61b94d4520cd4af999e8e38b3a475ac4 P 3d8ce816521247e18c0d5b9e130edc21 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251028 09:09:59.620187  8791 raft_consensus.cc:740] T 61b94d4520cd4af999e8e38b3a475ac4 P 3d8ce816521247e18c0d5b9e130edc21 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 3d8ce816521247e18c0d5b9e130edc21, State: Initialized, Role: FOLLOWER
I20251028 09:09:59.620291  8791 consensus_queue.cc:260] T 61b94d4520cd4af999e8e38b3a475ac4 P 3d8ce816521247e18c0d5b9e130edc21 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.620483  8794 tablet_bootstrap.cc:492] T a208b359e1a44c018a8ad71c60ff8fed P 5cf59e115bd641f8965613ab616c9b85: Bootstrap starting.
I20251028 09:09:59.620522  8790 tablet_bootstrap.cc:492] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b: No bootstrap required, opened a new log
I20251028 09:09:59.620569  8790 ts_tablet_manager.cc:1403] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b: Time spent bootstrapping tablet: real 0.007s	user 0.001s	sys 0.000s
I20251028 09:09:59.620573  8791 ts_tablet_manager.cc:1434] T 61b94d4520cd4af999e8e38b3a475ac4 P 3d8ce816521247e18c0d5b9e130edc21: Time spent starting tablet: real 0.002s	user 0.002s	sys 0.000s
I20251028 09:09:59.620642  8509 heartbeater.cc:499] Master 127.8.22.190:38971 was elected leader, sending a full tablet report...
I20251028 09:09:59.620658  8791 tablet_bootstrap.cc:492] T 9e00a8dbf00b4468bbbfc193e444299e P 3d8ce816521247e18c0d5b9e130edc21: Bootstrap starting.
I20251028 09:09:59.620725  8790 raft_consensus.cc:359] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.620785  8790 raft_consensus.cc:385] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251028 09:09:59.620803  8790 raft_consensus.cc:740] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: a73a6d78104342d7bfeb363acfac482b, State: Initialized, Role: FOLLOWER
I20251028 09:09:59.620847  8790 consensus_queue.cc:260] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.620983  8790 ts_tablet_manager.cc:1434] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:09:59.621052  8790 tablet_bootstrap.cc:492] T 9e00a8dbf00b4468bbbfc193e444299e P a73a6d78104342d7bfeb363acfac482b: Bootstrap starting.
I20251028 09:09:59.621145  8791 tablet_bootstrap.cc:654] T 9e00a8dbf00b4468bbbfc193e444299e P 3d8ce816521247e18c0d5b9e130edc21: Neither blocks nor log segments found. Creating new log.
I20251028 09:09:59.621167  8794 tablet_bootstrap.cc:654] T a208b359e1a44c018a8ad71c60ff8fed P 5cf59e115bd641f8965613ab616c9b85: Neither blocks nor log segments found. Creating new log.
I20251028 09:09:59.621483  8790 tablet_bootstrap.cc:654] T 9e00a8dbf00b4468bbbfc193e444299e P a73a6d78104342d7bfeb363acfac482b: Neither blocks nor log segments found. Creating new log.
I20251028 09:09:59.621692  8794 log.cc:826] T a208b359e1a44c018a8ad71c60ff8fed P 5cf59e115bd641f8965613ab616c9b85: Log is configured to *not* fsync() on all Append() calls
I20251028 09:09:59.621778  8791 tablet_bootstrap.cc:492] T 9e00a8dbf00b4468bbbfc193e444299e P 3d8ce816521247e18c0d5b9e130edc21: No bootstrap required, opened a new log
I20251028 09:09:59.621812  8791 ts_tablet_manager.cc:1403] T 9e00a8dbf00b4468bbbfc193e444299e P 3d8ce816521247e18c0d5b9e130edc21: Time spent bootstrapping tablet: real 0.001s	user 0.001s	sys 0.000s
I20251028 09:09:59.621966  8791 raft_consensus.cc:359] T 9e00a8dbf00b4468bbbfc193e444299e P 3d8ce816521247e18c0d5b9e130edc21 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.622025  8791 raft_consensus.cc:385] T 9e00a8dbf00b4468bbbfc193e444299e P 3d8ce816521247e18c0d5b9e130edc21 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251028 09:09:59.622051  8791 raft_consensus.cc:740] T 9e00a8dbf00b4468bbbfc193e444299e P 3d8ce816521247e18c0d5b9e130edc21 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 3d8ce816521247e18c0d5b9e130edc21, State: Initialized, Role: FOLLOWER
I20251028 09:09:59.622058  8790 tablet_bootstrap.cc:492] T 9e00a8dbf00b4468bbbfc193e444299e P a73a6d78104342d7bfeb363acfac482b: No bootstrap required, opened a new log
I20251028 09:09:59.622095  8790 ts_tablet_manager.cc:1403] T 9e00a8dbf00b4468bbbfc193e444299e P a73a6d78104342d7bfeb363acfac482b: Time spent bootstrapping tablet: real 0.001s	user 0.001s	sys 0.000s
I20251028 09:09:59.622090  8791 consensus_queue.cc:260] T 9e00a8dbf00b4468bbbfc193e444299e P 3d8ce816521247e18c0d5b9e130edc21 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.622191  8791 ts_tablet_manager.cc:1434] T 9e00a8dbf00b4468bbbfc193e444299e P 3d8ce816521247e18c0d5b9e130edc21: Time spent starting tablet: real 0.000s	user 0.001s	sys 0.000s
I20251028 09:09:59.622257  8791 tablet_bootstrap.cc:492] T 386ef2b749f24de9aac6947b29d3479c P 3d8ce816521247e18c0d5b9e130edc21: Bootstrap starting.
I20251028 09:09:59.622257  8790 raft_consensus.cc:359] T 9e00a8dbf00b4468bbbfc193e444299e P a73a6d78104342d7bfeb363acfac482b [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.622326  8790 raft_consensus.cc:385] T 9e00a8dbf00b4468bbbfc193e444299e P a73a6d78104342d7bfeb363acfac482b [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251028 09:09:59.622349  8790 raft_consensus.cc:740] T 9e00a8dbf00b4468bbbfc193e444299e P a73a6d78104342d7bfeb363acfac482b [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: a73a6d78104342d7bfeb363acfac482b, State: Initialized, Role: FOLLOWER
I20251028 09:09:59.622388  8790 consensus_queue.cc:260] T 9e00a8dbf00b4468bbbfc193e444299e P a73a6d78104342d7bfeb363acfac482b [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.622465  8790 ts_tablet_manager.cc:1434] T 9e00a8dbf00b4468bbbfc193e444299e P a73a6d78104342d7bfeb363acfac482b: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:09:59.622522  8790 tablet_bootstrap.cc:492] T 61b94d4520cd4af999e8e38b3a475ac4 P a73a6d78104342d7bfeb363acfac482b: Bootstrap starting.
I20251028 09:09:59.622668  8791 tablet_bootstrap.cc:654] T 386ef2b749f24de9aac6947b29d3479c P 3d8ce816521247e18c0d5b9e130edc21: Neither blocks nor log segments found. Creating new log.
I20251028 09:09:59.623183  8790 tablet_bootstrap.cc:654] T 61b94d4520cd4af999e8e38b3a475ac4 P a73a6d78104342d7bfeb363acfac482b: Neither blocks nor log segments found. Creating new log.
I20251028 09:09:59.623441  8791 tablet_bootstrap.cc:492] T 386ef2b749f24de9aac6947b29d3479c P 3d8ce816521247e18c0d5b9e130edc21: No bootstrap required, opened a new log
I20251028 09:09:59.623481  8791 ts_tablet_manager.cc:1403] T 386ef2b749f24de9aac6947b29d3479c P 3d8ce816521247e18c0d5b9e130edc21: Time spent bootstrapping tablet: real 0.001s	user 0.000s	sys 0.001s
I20251028 09:09:59.623772  8790 tablet_bootstrap.cc:492] T 61b94d4520cd4af999e8e38b3a475ac4 P a73a6d78104342d7bfeb363acfac482b: No bootstrap required, opened a new log
I20251028 09:09:59.623754  8791 raft_consensus.cc:359] T 386ef2b749f24de9aac6947b29d3479c P 3d8ce816521247e18c0d5b9e130edc21 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.623821  8791 raft_consensus.cc:385] T 386ef2b749f24de9aac6947b29d3479c P 3d8ce816521247e18c0d5b9e130edc21 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251028 09:09:59.623855  8791 raft_consensus.cc:740] T 386ef2b749f24de9aac6947b29d3479c P 3d8ce816521247e18c0d5b9e130edc21 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 3d8ce816521247e18c0d5b9e130edc21, State: Initialized, Role: FOLLOWER
I20251028 09:09:59.623883  8790 ts_tablet_manager.cc:1403] T 61b94d4520cd4af999e8e38b3a475ac4 P a73a6d78104342d7bfeb363acfac482b: Time spent bootstrapping tablet: real 0.001s	user 0.001s	sys 0.000s
I20251028 09:09:59.623919  8791 consensus_queue.cc:260] T 386ef2b749f24de9aac6947b29d3479c P 3d8ce816521247e18c0d5b9e130edc21 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.624017  8791 ts_tablet_manager.cc:1434] T 386ef2b749f24de9aac6947b29d3479c P 3d8ce816521247e18c0d5b9e130edc21: Time spent starting tablet: real 0.001s	user 0.000s	sys 0.001s
I20251028 09:09:59.624079  8791 tablet_bootstrap.cc:492] T 189fdd871f304eb888387f1f15973390 P 3d8ce816521247e18c0d5b9e130edc21: Bootstrap starting.
I20251028 09:09:59.624076  8790 raft_consensus.cc:359] T 61b94d4520cd4af999e8e38b3a475ac4 P a73a6d78104342d7bfeb363acfac482b [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.624145  8790 raft_consensus.cc:385] T 61b94d4520cd4af999e8e38b3a475ac4 P a73a6d78104342d7bfeb363acfac482b [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251028 09:09:59.624169  8790 raft_consensus.cc:740] T 61b94d4520cd4af999e8e38b3a475ac4 P a73a6d78104342d7bfeb363acfac482b [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: a73a6d78104342d7bfeb363acfac482b, State: Initialized, Role: FOLLOWER
I20251028 09:09:59.624236  8790 consensus_queue.cc:260] T 61b94d4520cd4af999e8e38b3a475ac4 P a73a6d78104342d7bfeb363acfac482b [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.624321  8790 ts_tablet_manager.cc:1434] T 61b94d4520cd4af999e8e38b3a475ac4 P a73a6d78104342d7bfeb363acfac482b: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:09:59.624398  8790 tablet_bootstrap.cc:492] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b: Bootstrap starting.
I20251028 09:09:59.624497  8791 tablet_bootstrap.cc:654] T 189fdd871f304eb888387f1f15973390 P 3d8ce816521247e18c0d5b9e130edc21: Neither blocks nor log segments found. Creating new log.
I20251028 09:09:59.624845  8790 tablet_bootstrap.cc:654] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b: Neither blocks nor log segments found. Creating new log.
I20251028 09:09:59.625140  8791 tablet_bootstrap.cc:492] T 189fdd871f304eb888387f1f15973390 P 3d8ce816521247e18c0d5b9e130edc21: No bootstrap required, opened a new log
I20251028 09:09:59.625190  8791 ts_tablet_manager.cc:1403] T 189fdd871f304eb888387f1f15973390 P 3d8ce816521247e18c0d5b9e130edc21: Time spent bootstrapping tablet: real 0.001s	user 0.000s	sys 0.001s
I20251028 09:09:59.625360  8791 raft_consensus.cc:359] T 189fdd871f304eb888387f1f15973390 P 3d8ce816521247e18c0d5b9e130edc21 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.625418  8791 raft_consensus.cc:385] T 189fdd871f304eb888387f1f15973390 P 3d8ce816521247e18c0d5b9e130edc21 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251028 09:09:59.625438  8791 raft_consensus.cc:740] T 189fdd871f304eb888387f1f15973390 P 3d8ce816521247e18c0d5b9e130edc21 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 3d8ce816521247e18c0d5b9e130edc21, State: Initialized, Role: FOLLOWER
I20251028 09:09:59.625468  8791 consensus_queue.cc:260] T 189fdd871f304eb888387f1f15973390 P 3d8ce816521247e18c0d5b9e130edc21 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.625479  8794 tablet_bootstrap.cc:492] T a208b359e1a44c018a8ad71c60ff8fed P 5cf59e115bd641f8965613ab616c9b85: No bootstrap required, opened a new log
I20251028 09:09:59.625551  8794 ts_tablet_manager.cc:1403] T a208b359e1a44c018a8ad71c60ff8fed P 5cf59e115bd641f8965613ab616c9b85: Time spent bootstrapping tablet: real 0.005s	user 0.001s	sys 0.000s
I20251028 09:09:59.625563  8791 ts_tablet_manager.cc:1434] T 189fdd871f304eb888387f1f15973390 P 3d8ce816521247e18c0d5b9e130edc21: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:09:59.625617  8791 tablet_bootstrap.cc:492] T bbe94a484b0c4198bb904ffa7d3ef621 P 3d8ce816521247e18c0d5b9e130edc21: Bootstrap starting.
I20251028 09:09:59.625880  8790 tablet_bootstrap.cc:492] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b: No bootstrap required, opened a new log
I20251028 09:09:59.625931  8790 ts_tablet_manager.cc:1403] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b: Time spent bootstrapping tablet: real 0.002s	user 0.001s	sys 0.000s
I20251028 09:09:59.626087  8791 tablet_bootstrap.cc:654] T bbe94a484b0c4198bb904ffa7d3ef621 P 3d8ce816521247e18c0d5b9e130edc21: Neither blocks nor log segments found. Creating new log.
I20251028 09:09:59.626073  8790 raft_consensus.cc:359] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.626125  8790 raft_consensus.cc:385] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251028 09:09:59.626144  8790 raft_consensus.cc:740] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: a73a6d78104342d7bfeb363acfac482b, State: Initialized, Role: FOLLOWER
I20251028 09:09:59.626183  8790 consensus_queue.cc:260] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.626261  8790 ts_tablet_manager.cc:1434] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:09:59.626320  8790 tablet_bootstrap.cc:492] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b: Bootstrap starting.
I20251028 09:09:59.626722  8790 tablet_bootstrap.cc:654] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b: Neither blocks nor log segments found. Creating new log.
I20251028 09:09:59.627099  8791 tablet_bootstrap.cc:492] T bbe94a484b0c4198bb904ffa7d3ef621 P 3d8ce816521247e18c0d5b9e130edc21: No bootstrap required, opened a new log
I20251028 09:09:59.627156  8791 ts_tablet_manager.cc:1403] T bbe94a484b0c4198bb904ffa7d3ef621 P 3d8ce816521247e18c0d5b9e130edc21: Time spent bootstrapping tablet: real 0.002s	user 0.000s	sys 0.001s
I20251028 09:09:59.627318  8791 raft_consensus.cc:359] T bbe94a484b0c4198bb904ffa7d3ef621 P 3d8ce816521247e18c0d5b9e130edc21 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.627377  8791 raft_consensus.cc:385] T bbe94a484b0c4198bb904ffa7d3ef621 P 3d8ce816521247e18c0d5b9e130edc21 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251028 09:09:59.627401  8791 raft_consensus.cc:740] T bbe94a484b0c4198bb904ffa7d3ef621 P 3d8ce816521247e18c0d5b9e130edc21 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 3d8ce816521247e18c0d5b9e130edc21, State: Initialized, Role: FOLLOWER
I20251028 09:09:59.627343  8794 raft_consensus.cc:359] T a208b359e1a44c018a8ad71c60ff8fed P 5cf59e115bd641f8965613ab616c9b85 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } }
I20251028 09:09:59.627446  8791 consensus_queue.cc:260] T bbe94a484b0c4198bb904ffa7d3ef621 P 3d8ce816521247e18c0d5b9e130edc21 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.627466  8790 tablet_bootstrap.cc:492] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b: No bootstrap required, opened a new log
I20251028 09:09:59.627506  8790 ts_tablet_manager.cc:1403] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b: Time spent bootstrapping tablet: real 0.001s	user 0.000s	sys 0.001s
I20251028 09:09:59.627498  8794 raft_consensus.cc:385] T a208b359e1a44c018a8ad71c60ff8fed P 5cf59e115bd641f8965613ab616c9b85 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251028 09:09:59.627538  8794 raft_consensus.cc:740] T a208b359e1a44c018a8ad71c60ff8fed P 5cf59e115bd641f8965613ab616c9b85 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 5cf59e115bd641f8965613ab616c9b85, State: Initialized, Role: FOLLOWER
I20251028 09:09:59.627550  8791 ts_tablet_manager.cc:1434] T bbe94a484b0c4198bb904ffa7d3ef621 P 3d8ce816521247e18c0d5b9e130edc21: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.001s
I20251028 09:09:59.627612  8791 tablet_bootstrap.cc:492] T a208b359e1a44c018a8ad71c60ff8fed P 3d8ce816521247e18c0d5b9e130edc21: Bootstrap starting.
I20251028 09:09:59.627774  8790 raft_consensus.cc:359] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } }
I20251028 09:09:59.627843  8790 raft_consensus.cc:385] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251028 09:09:59.627871  8790 raft_consensus.cc:740] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: a73a6d78104342d7bfeb363acfac482b, State: Initialized, Role: FOLLOWER
I20251028 09:09:59.627789  8794 consensus_queue.cc:260] T a208b359e1a44c018a8ad71c60ff8fed P 5cf59e115bd641f8965613ab616c9b85 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } }
I20251028 09:09:59.627933  8790 consensus_queue.cc:260] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } }
I20251028 09:09:59.628058  8791 tablet_bootstrap.cc:654] T a208b359e1a44c018a8ad71c60ff8fed P 3d8ce816521247e18c0d5b9e130edc21: Neither blocks nor log segments found. Creating new log.
I20251028 09:09:59.628070  8790 ts_tablet_manager.cc:1434] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b: Time spent starting tablet: real 0.001s	user 0.000s	sys 0.001s
I20251028 09:09:59.628288  8794 ts_tablet_manager.cc:1434] T a208b359e1a44c018a8ad71c60ff8fed P 5cf59e115bd641f8965613ab616c9b85: Time spent starting tablet: real 0.003s	user 0.003s	sys 0.000s
I20251028 09:09:59.628367  8640 heartbeater.cc:499] Master 127.8.22.190:38971 was elected leader, sending a full tablet report...
I20251028 09:09:59.628386  8794 tablet_bootstrap.cc:492] T 386ef2b749f24de9aac6947b29d3479c P 5cf59e115bd641f8965613ab616c9b85: Bootstrap starting.
I20251028 09:09:59.628865  8794 tablet_bootstrap.cc:654] T 386ef2b749f24de9aac6947b29d3479c P 5cf59e115bd641f8965613ab616c9b85: Neither blocks nor log segments found. Creating new log.
I20251028 09:09:59.628909  8791 tablet_bootstrap.cc:492] T a208b359e1a44c018a8ad71c60ff8fed P 3d8ce816521247e18c0d5b9e130edc21: No bootstrap required, opened a new log
I20251028 09:09:59.628962  8791 ts_tablet_manager.cc:1403] T a208b359e1a44c018a8ad71c60ff8fed P 3d8ce816521247e18c0d5b9e130edc21: Time spent bootstrapping tablet: real 0.001s	user 0.000s	sys 0.001s
I20251028 09:09:59.629120  8791 raft_consensus.cc:359] T a208b359e1a44c018a8ad71c60ff8fed P 3d8ce816521247e18c0d5b9e130edc21 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } }
I20251028 09:09:59.629187  8791 raft_consensus.cc:385] T a208b359e1a44c018a8ad71c60ff8fed P 3d8ce816521247e18c0d5b9e130edc21 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251028 09:09:59.629217  8791 raft_consensus.cc:740] T a208b359e1a44c018a8ad71c60ff8fed P 3d8ce816521247e18c0d5b9e130edc21 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 3d8ce816521247e18c0d5b9e130edc21, State: Initialized, Role: FOLLOWER
I20251028 09:09:59.629487  8794 tablet_bootstrap.cc:492] T 386ef2b749f24de9aac6947b29d3479c P 5cf59e115bd641f8965613ab616c9b85: No bootstrap required, opened a new log
I20251028 09:09:59.629473  8791 consensus_queue.cc:260] T a208b359e1a44c018a8ad71c60ff8fed P 3d8ce816521247e18c0d5b9e130edc21 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } }
I20251028 09:09:59.629534  8794 ts_tablet_manager.cc:1403] T 386ef2b749f24de9aac6947b29d3479c P 5cf59e115bd641f8965613ab616c9b85: Time spent bootstrapping tablet: real 0.001s	user 0.001s	sys 0.000s
I20251028 09:09:59.629604  8791 ts_tablet_manager.cc:1434] T a208b359e1a44c018a8ad71c60ff8fed P 3d8ce816521247e18c0d5b9e130edc21: Time spent starting tablet: real 0.001s	user 0.000s	sys 0.000s
I20251028 09:09:59.629703  8794 raft_consensus.cc:359] T 386ef2b749f24de9aac6947b29d3479c P 5cf59e115bd641f8965613ab616c9b85 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.629769  8794 raft_consensus.cc:385] T 386ef2b749f24de9aac6947b29d3479c P 5cf59e115bd641f8965613ab616c9b85 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251028 09:09:59.629789  8794 raft_consensus.cc:740] T 386ef2b749f24de9aac6947b29d3479c P 5cf59e115bd641f8965613ab616c9b85 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 5cf59e115bd641f8965613ab616c9b85, State: Initialized, Role: FOLLOWER
I20251028 09:09:59.629850  8794 consensus_queue.cc:260] T 386ef2b749f24de9aac6947b29d3479c P 5cf59e115bd641f8965613ab616c9b85 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.629995  8794 ts_tablet_manager.cc:1434] T 386ef2b749f24de9aac6947b29d3479c P 5cf59e115bd641f8965613ab616c9b85: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:09:59.630064  8794 tablet_bootstrap.cc:492] T 189fdd871f304eb888387f1f15973390 P 5cf59e115bd641f8965613ab616c9b85: Bootstrap starting.
I20251028 09:09:59.630509  8794 tablet_bootstrap.cc:654] T 189fdd871f304eb888387f1f15973390 P 5cf59e115bd641f8965613ab616c9b85: Neither blocks nor log segments found. Creating new log.
I20251028 09:09:59.631088  8794 tablet_bootstrap.cc:492] T 189fdd871f304eb888387f1f15973390 P 5cf59e115bd641f8965613ab616c9b85: No bootstrap required, opened a new log
I20251028 09:09:59.631130  8794 ts_tablet_manager.cc:1403] T 189fdd871f304eb888387f1f15973390 P 5cf59e115bd641f8965613ab616c9b85: Time spent bootstrapping tablet: real 0.001s	user 0.001s	sys 0.000s
I20251028 09:09:59.631278  8794 raft_consensus.cc:359] T 189fdd871f304eb888387f1f15973390 P 5cf59e115bd641f8965613ab616c9b85 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.631326  8794 raft_consensus.cc:385] T 189fdd871f304eb888387f1f15973390 P 5cf59e115bd641f8965613ab616c9b85 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251028 09:09:59.631340  8794 raft_consensus.cc:740] T 189fdd871f304eb888387f1f15973390 P 5cf59e115bd641f8965613ab616c9b85 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 5cf59e115bd641f8965613ab616c9b85, State: Initialized, Role: FOLLOWER
I20251028 09:09:59.631388  8794 consensus_queue.cc:260] T 189fdd871f304eb888387f1f15973390 P 5cf59e115bd641f8965613ab616c9b85 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.631476  8794 ts_tablet_manager.cc:1434] T 189fdd871f304eb888387f1f15973390 P 5cf59e115bd641f8965613ab616c9b85: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:09:59.631531  8794 tablet_bootstrap.cc:492] T bbe94a484b0c4198bb904ffa7d3ef621 P 5cf59e115bd641f8965613ab616c9b85: Bootstrap starting.
I20251028 09:09:59.632136  8794 tablet_bootstrap.cc:654] T bbe94a484b0c4198bb904ffa7d3ef621 P 5cf59e115bd641f8965613ab616c9b85: Neither blocks nor log segments found. Creating new log.
I20251028 09:09:59.632750  8794 tablet_bootstrap.cc:492] T bbe94a484b0c4198bb904ffa7d3ef621 P 5cf59e115bd641f8965613ab616c9b85: No bootstrap required, opened a new log
I20251028 09:09:59.632789  8794 ts_tablet_manager.cc:1403] T bbe94a484b0c4198bb904ffa7d3ef621 P 5cf59e115bd641f8965613ab616c9b85: Time spent bootstrapping tablet: real 0.001s	user 0.001s	sys 0.000s
I20251028 09:09:59.632896  8794 raft_consensus.cc:359] T bbe94a484b0c4198bb904ffa7d3ef621 P 5cf59e115bd641f8965613ab616c9b85 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.632937  8794 raft_consensus.cc:385] T bbe94a484b0c4198bb904ffa7d3ef621 P 5cf59e115bd641f8965613ab616c9b85 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251028 09:09:59.632951  8794 raft_consensus.cc:740] T bbe94a484b0c4198bb904ffa7d3ef621 P 5cf59e115bd641f8965613ab616c9b85 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 5cf59e115bd641f8965613ab616c9b85, State: Initialized, Role: FOLLOWER
I20251028 09:09:59.632984  8794 consensus_queue.cc:260] T bbe94a484b0c4198bb904ffa7d3ef621 P 5cf59e115bd641f8965613ab616c9b85 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.633074  8794 ts_tablet_manager.cc:1434] T bbe94a484b0c4198bb904ffa7d3ef621 P 5cf59e115bd641f8965613ab616c9b85: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:09:59.633131  8794 tablet_bootstrap.cc:492] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85: Bootstrap starting.
I20251028 09:09:59.633545  8794 tablet_bootstrap.cc:654] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85: Neither blocks nor log segments found. Creating new log.
I20251028 09:09:59.634037  8794 tablet_bootstrap.cc:492] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85: No bootstrap required, opened a new log
I20251028 09:09:59.634073  8794 ts_tablet_manager.cc:1403] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85: Time spent bootstrapping tablet: real 0.001s	user 0.001s	sys 0.000s
I20251028 09:09:59.634176  8794 raft_consensus.cc:359] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.634225  8794 raft_consensus.cc:385] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251028 09:09:59.634239  8794 raft_consensus.cc:740] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 5cf59e115bd641f8965613ab616c9b85, State: Initialized, Role: FOLLOWER
I20251028 09:09:59.634270  8794 consensus_queue.cc:260] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.634377  8794 ts_tablet_manager.cc:1434] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:09:59.634435  8794 tablet_bootstrap.cc:492] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85: Bootstrap starting.
I20251028 09:09:59.634814  8794 tablet_bootstrap.cc:654] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85: Neither blocks nor log segments found. Creating new log.
I20251028 09:09:59.635469  8794 tablet_bootstrap.cc:492] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85: No bootstrap required, opened a new log
I20251028 09:09:59.635509  8794 ts_tablet_manager.cc:1403] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85: Time spent bootstrapping tablet: real 0.001s	user 0.001s	sys 0.000s
I20251028 09:09:59.635636  8794 raft_consensus.cc:359] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.635686  8794 raft_consensus.cc:385] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251028 09:09:59.635712  8794 raft_consensus.cc:740] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 5cf59e115bd641f8965613ab616c9b85, State: Initialized, Role: FOLLOWER
I20251028 09:09:59.635762  8794 consensus_queue.cc:260] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.635843  8794 ts_tablet_manager.cc:1434] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:09:59.642758  8797 raft_consensus.cc:493] T 386ef2b749f24de9aac6947b29d3479c P 3d8ce816521247e18c0d5b9e130edc21 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251028 09:09:59.642891  8797 raft_consensus.cc:515] T 386ef2b749f24de9aac6947b29d3479c P 3d8ce816521247e18c0d5b9e130edc21 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.643255  8797 leader_election.cc:290] T 386ef2b749f24de9aac6947b29d3479c P 3d8ce816521247e18c0d5b9e130edc21 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 5cf59e115bd641f8965613ab616c9b85 (127.8.22.130:46821), a73a6d78104342d7bfeb363acfac482b (127.8.22.131:38055)
I20251028 09:09:59.645924  8594 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "386ef2b749f24de9aac6947b29d3479c" candidate_uuid: "3d8ce816521247e18c0d5b9e130edc21" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "5cf59e115bd641f8965613ab616c9b85" is_pre_election: true
I20251028 09:09:59.646088  8594 raft_consensus.cc:2468] T 386ef2b749f24de9aac6947b29d3479c P 5cf59e115bd641f8965613ab616c9b85 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 3d8ce816521247e18c0d5b9e130edc21 in term 0.
I20251028 09:09:59.646080  8725 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "386ef2b749f24de9aac6947b29d3479c" candidate_uuid: "3d8ce816521247e18c0d5b9e130edc21" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "a73a6d78104342d7bfeb363acfac482b" is_pre_election: true
I20251028 09:09:59.646286  8725 raft_consensus.cc:2468] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 3d8ce816521247e18c0d5b9e130edc21 in term 0.
I20251028 09:09:59.646306  8399 leader_election.cc:304] T 386ef2b749f24de9aac6947b29d3479c P 3d8ce816521247e18c0d5b9e130edc21 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 3d8ce816521247e18c0d5b9e130edc21, 5cf59e115bd641f8965613ab616c9b85; no voters: 
I20251028 09:09:59.646427  8797 raft_consensus.cc:2804] T 386ef2b749f24de9aac6947b29d3479c P 3d8ce816521247e18c0d5b9e130edc21 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20251028 09:09:59.646486  8797 raft_consensus.cc:493] T 386ef2b749f24de9aac6947b29d3479c P 3d8ce816521247e18c0d5b9e130edc21 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20251028 09:09:59.646522  8797 raft_consensus.cc:3060] T 386ef2b749f24de9aac6947b29d3479c P 3d8ce816521247e18c0d5b9e130edc21 [term 0 FOLLOWER]: Advancing to term 1
I20251028 09:09:59.647212  8797 raft_consensus.cc:515] T 386ef2b749f24de9aac6947b29d3479c P 3d8ce816521247e18c0d5b9e130edc21 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.647346  8797 leader_election.cc:290] T 386ef2b749f24de9aac6947b29d3479c P 3d8ce816521247e18c0d5b9e130edc21 [CANDIDATE]: Term 1 election: Requested vote from peers 5cf59e115bd641f8965613ab616c9b85 (127.8.22.130:46821), a73a6d78104342d7bfeb363acfac482b (127.8.22.131:38055)
I20251028 09:09:59.647507  8594 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "386ef2b749f24de9aac6947b29d3479c" candidate_uuid: "3d8ce816521247e18c0d5b9e130edc21" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "5cf59e115bd641f8965613ab616c9b85"
I20251028 09:09:59.647522  8725 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "386ef2b749f24de9aac6947b29d3479c" candidate_uuid: "3d8ce816521247e18c0d5b9e130edc21" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "a73a6d78104342d7bfeb363acfac482b"
I20251028 09:09:59.647590  8594 raft_consensus.cc:3060] T 386ef2b749f24de9aac6947b29d3479c P 5cf59e115bd641f8965613ab616c9b85 [term 0 FOLLOWER]: Advancing to term 1
I20251028 09:09:59.647599  8725 raft_consensus.cc:3060] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b [term 0 FOLLOWER]: Advancing to term 1
I20251028 09:09:59.647781  8807 raft_consensus.cc:493] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251028 09:09:59.647856  8807 raft_consensus.cc:515] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.648566  8725 raft_consensus.cc:2468] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 3d8ce816521247e18c0d5b9e130edc21 in term 1.
I20251028 09:09:59.648751  8396 leader_election.cc:304] T 386ef2b749f24de9aac6947b29d3479c P 3d8ce816521247e18c0d5b9e130edc21 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 3d8ce816521247e18c0d5b9e130edc21, a73a6d78104342d7bfeb363acfac482b; no voters: 
I20251028 09:09:59.648850  8797 raft_consensus.cc:2804] T 386ef2b749f24de9aac6947b29d3479c P 3d8ce816521247e18c0d5b9e130edc21 [term 1 FOLLOWER]: Leader election won for term 1
I20251028 09:09:59.648880  8807 leader_election.cc:290] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547), a73a6d78104342d7bfeb363acfac482b (127.8.22.131:38055)
I20251028 09:09:59.648928  8594 raft_consensus.cc:2468] T 386ef2b749f24de9aac6947b29d3479c P 5cf59e115bd641f8965613ab616c9b85 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 3d8ce816521247e18c0d5b9e130edc21 in term 1.
I20251028 09:09:59.648968  8797 raft_consensus.cc:697] T 386ef2b749f24de9aac6947b29d3479c P 3d8ce816521247e18c0d5b9e130edc21 [term 1 LEADER]: Becoming Leader. State: Replica: 3d8ce816521247e18c0d5b9e130edc21, State: Running, Role: LEADER
I20251028 09:09:59.649070  8797 consensus_queue.cc:237] T 386ef2b749f24de9aac6947b29d3479c P 3d8ce816521247e18c0d5b9e130edc21 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.649858  8321 catalog_manager.cc:5649] T 386ef2b749f24de9aac6947b29d3479c P 3d8ce816521247e18c0d5b9e130edc21 reported cstate change: term changed from 0 to 1, leader changed from <none> to 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129). New cstate: current_term: 1 leader_uuid: "3d8ce816521247e18c0d5b9e130edc21" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } health_report { overall_health: UNKNOWN } } }
I20251028 09:09:59.650812  8797 raft_consensus.cc:493] T a208b359e1a44c018a8ad71c60ff8fed P 3d8ce816521247e18c0d5b9e130edc21 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251028 09:09:59.650878  8797 raft_consensus.cc:515] T a208b359e1a44c018a8ad71c60ff8fed P 3d8ce816521247e18c0d5b9e130edc21 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } }
I20251028 09:09:59.651037  8797 leader_election.cc:290] T a208b359e1a44c018a8ad71c60ff8fed P 3d8ce816521247e18c0d5b9e130edc21 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers a73a6d78104342d7bfeb363acfac482b (127.8.22.131:38055), 5cf59e115bd641f8965613ab616c9b85 (127.8.22.130:46821)
I20251028 09:09:59.651248  8725 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "a208b359e1a44c018a8ad71c60ff8fed" candidate_uuid: "3d8ce816521247e18c0d5b9e130edc21" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "a73a6d78104342d7bfeb363acfac482b" is_pre_election: true
I20251028 09:09:59.651316  8725 raft_consensus.cc:2468] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 3d8ce816521247e18c0d5b9e130edc21 in term 0.
I20251028 09:09:59.651289  8594 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "a208b359e1a44c018a8ad71c60ff8fed" candidate_uuid: "3d8ce816521247e18c0d5b9e130edc21" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "5cf59e115bd641f8965613ab616c9b85" is_pre_election: true
I20251028 09:09:59.651355  8594 raft_consensus.cc:2468] T a208b359e1a44c018a8ad71c60ff8fed P 5cf59e115bd641f8965613ab616c9b85 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 3d8ce816521247e18c0d5b9e130edc21 in term 0.
I20251028 09:09:59.651435  8396 leader_election.cc:304] T a208b359e1a44c018a8ad71c60ff8fed P 3d8ce816521247e18c0d5b9e130edc21 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 3d8ce816521247e18c0d5b9e130edc21, a73a6d78104342d7bfeb363acfac482b; no voters: 
I20251028 09:09:59.651537  8797 raft_consensus.cc:2804] T a208b359e1a44c018a8ad71c60ff8fed P 3d8ce816521247e18c0d5b9e130edc21 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20251028 09:09:59.651576  8797 raft_consensus.cc:493] T a208b359e1a44c018a8ad71c60ff8fed P 3d8ce816521247e18c0d5b9e130edc21 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20251028 09:09:59.651597  8797 raft_consensus.cc:3060] T a208b359e1a44c018a8ad71c60ff8fed P 3d8ce816521247e18c0d5b9e130edc21 [term 0 FOLLOWER]: Advancing to term 1
I20251028 09:09:59.651859  8725 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "9e00a8dbf00b4468bbbfc193e444299e" candidate_uuid: "5cf59e115bd641f8965613ab616c9b85" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "a73a6d78104342d7bfeb363acfac482b" is_pre_election: true
I20251028 09:09:59.651918  8725 raft_consensus.cc:2468] T 9e00a8dbf00b4468bbbfc193e444299e P a73a6d78104342d7bfeb363acfac482b [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 5cf59e115bd641f8965613ab616c9b85 in term 0.
I20251028 09:09:59.652065  8527 leader_election.cc:304] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 5cf59e115bd641f8965613ab616c9b85, a73a6d78104342d7bfeb363acfac482b; no voters: 
I20251028 09:09:59.652268  8797 raft_consensus.cc:515] T a208b359e1a44c018a8ad71c60ff8fed P 3d8ce816521247e18c0d5b9e130edc21 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } }
I20251028 09:09:59.652393  8797 leader_election.cc:290] T a208b359e1a44c018a8ad71c60ff8fed P 3d8ce816521247e18c0d5b9e130edc21 [CANDIDATE]: Term 1 election: Requested vote from peers a73a6d78104342d7bfeb363acfac482b (127.8.22.131:38055), 5cf59e115bd641f8965613ab616c9b85 (127.8.22.130:46821)
I20251028 09:09:59.652521  8594 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "a208b359e1a44c018a8ad71c60ff8fed" candidate_uuid: "3d8ce816521247e18c0d5b9e130edc21" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "5cf59e115bd641f8965613ab616c9b85"
I20251028 09:09:59.652575  8807 raft_consensus.cc:2804] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20251028 09:09:59.652602  8725 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "a208b359e1a44c018a8ad71c60ff8fed" candidate_uuid: "3d8ce816521247e18c0d5b9e130edc21" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "a73a6d78104342d7bfeb363acfac482b"
I20251028 09:09:59.652582  8594 raft_consensus.cc:3060] T a208b359e1a44c018a8ad71c60ff8fed P 5cf59e115bd641f8965613ab616c9b85 [term 0 FOLLOWER]: Advancing to term 1
I20251028 09:09:59.652652  8725 raft_consensus.cc:3060] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b [term 0 FOLLOWER]: Advancing to term 1
I20251028 09:09:59.652655  8807 raft_consensus.cc:493] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20251028 09:09:59.652683  8807 raft_consensus.cc:3060] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 [term 0 FOLLOWER]: Advancing to term 1
I20251028 09:09:59.653281  8807 raft_consensus.cc:515] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.653295  8725 raft_consensus.cc:2468] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 3d8ce816521247e18c0d5b9e130edc21 in term 1.
I20251028 09:09:59.653359  8594 raft_consensus.cc:2468] T a208b359e1a44c018a8ad71c60ff8fed P 5cf59e115bd641f8965613ab616c9b85 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 3d8ce816521247e18c0d5b9e130edc21 in term 1.
I20251028 09:09:59.653388  8807 leader_election.cc:290] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 [CANDIDATE]: Term 1 election: Requested vote from peers 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547), a73a6d78104342d7bfeb363acfac482b (127.8.22.131:38055)
I20251028 09:09:59.653456  8396 leader_election.cc:304] T a208b359e1a44c018a8ad71c60ff8fed P 3d8ce816521247e18c0d5b9e130edc21 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 3d8ce816521247e18c0d5b9e130edc21, a73a6d78104342d7bfeb363acfac482b; no voters: 
I20251028 09:09:59.653534  8797 raft_consensus.cc:2804] T a208b359e1a44c018a8ad71c60ff8fed P 3d8ce816521247e18c0d5b9e130edc21 [term 1 FOLLOWER]: Leader election won for term 1
I20251028 09:09:59.653569  8797 raft_consensus.cc:697] T a208b359e1a44c018a8ad71c60ff8fed P 3d8ce816521247e18c0d5b9e130edc21 [term 1 LEADER]: Becoming Leader. State: Replica: 3d8ce816521247e18c0d5b9e130edc21, State: Running, Role: LEADER
I20251028 09:09:59.653612  8797 consensus_queue.cc:237] T a208b359e1a44c018a8ad71c60ff8fed P 3d8ce816521247e18c0d5b9e130edc21 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } }
I20251028 09:09:59.653738  8725 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "9e00a8dbf00b4468bbbfc193e444299e" candidate_uuid: "5cf59e115bd641f8965613ab616c9b85" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "a73a6d78104342d7bfeb363acfac482b"
I20251028 09:09:59.653795  8725 raft_consensus.cc:3060] T 9e00a8dbf00b4468bbbfc193e444299e P a73a6d78104342d7bfeb363acfac482b [term 0 FOLLOWER]: Advancing to term 1
I20251028 09:09:59.653879  8462 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "9e00a8dbf00b4468bbbfc193e444299e" candidate_uuid: "5cf59e115bd641f8965613ab616c9b85" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "3d8ce816521247e18c0d5b9e130edc21"
I20251028 09:09:59.653885  8463 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "9e00a8dbf00b4468bbbfc193e444299e" candidate_uuid: "5cf59e115bd641f8965613ab616c9b85" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "3d8ce816521247e18c0d5b9e130edc21" is_pre_election: true
I20251028 09:09:59.654016  8463 raft_consensus.cc:2468] T 9e00a8dbf00b4468bbbfc193e444299e P 3d8ce816521247e18c0d5b9e130edc21 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 5cf59e115bd641f8965613ab616c9b85 in term 0.
I20251028 09:09:59.654348  8725 raft_consensus.cc:2468] T 9e00a8dbf00b4468bbbfc193e444299e P a73a6d78104342d7bfeb363acfac482b [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 5cf59e115bd641f8965613ab616c9b85 in term 1.
I20251028 09:09:59.654330  8321 catalog_manager.cc:5649] T a208b359e1a44c018a8ad71c60ff8fed P 3d8ce816521247e18c0d5b9e130edc21 reported cstate change: term changed from 0 to 1, leader changed from <none> to 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129). New cstate: current_term: 1 leader_uuid: "3d8ce816521247e18c0d5b9e130edc21" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } health_report { overall_health: HEALTHY } } }
I20251028 09:09:59.654481  8527 leader_election.cc:304] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 5cf59e115bd641f8965613ab616c9b85, a73a6d78104342d7bfeb363acfac482b; no voters: 
I20251028 09:09:59.654567  8807 raft_consensus.cc:2804] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 [term 1 FOLLOWER]: Leader election won for term 1
I20251028 09:09:59.654711  8807 raft_consensus.cc:697] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 [term 1 LEADER]: Becoming Leader. State: Replica: 5cf59e115bd641f8965613ab616c9b85, State: Running, Role: LEADER
I20251028 09:09:59.654793  8807 consensus_queue.cc:237] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.655463  8321 catalog_manager.cc:5649] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 reported cstate change: term changed from 0 to 1, leader changed from <none> to 5cf59e115bd641f8965613ab616c9b85 (127.8.22.130). New cstate: current_term: 1 leader_uuid: "5cf59e115bd641f8965613ab616c9b85" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } health_report { overall_health: UNKNOWN } } }
I20251028 09:09:59.668236  8793 raft_consensus.cc:493] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251028 09:09:59.668332  8793 raft_consensus.cc:515] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.668625  8793 leader_election.cc:290] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547), 5cf59e115bd641f8965613ab616c9b85 (127.8.22.130:46821)
I20251028 09:09:59.671820  8594 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "189fdd871f304eb888387f1f15973390" candidate_uuid: "a73a6d78104342d7bfeb363acfac482b" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "5cf59e115bd641f8965613ab616c9b85" is_pre_election: true
I20251028 09:09:59.671835  8463 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "189fdd871f304eb888387f1f15973390" candidate_uuid: "a73a6d78104342d7bfeb363acfac482b" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "3d8ce816521247e18c0d5b9e130edc21" is_pre_election: true
I20251028 09:09:59.671911  8594 raft_consensus.cc:2468] T 189fdd871f304eb888387f1f15973390 P 5cf59e115bd641f8965613ab616c9b85 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate a73a6d78104342d7bfeb363acfac482b in term 0.
I20251028 09:09:59.671922  8463 raft_consensus.cc:2468] T 189fdd871f304eb888387f1f15973390 P 3d8ce816521247e18c0d5b9e130edc21 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate a73a6d78104342d7bfeb363acfac482b in term 0.
I20251028 09:09:59.672071  8661 leader_election.cc:304] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 5cf59e115bd641f8965613ab616c9b85, a73a6d78104342d7bfeb363acfac482b; no voters: 
I20251028 09:09:59.672214  8793 raft_consensus.cc:2804] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b [term 0 FOLLOWER]: Leader pre-election won for term 1
I20251028 09:09:59.672284  8793 raft_consensus.cc:493] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20251028 09:09:59.672307  8793 raft_consensus.cc:3060] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b [term 0 FOLLOWER]: Advancing to term 1
I20251028 09:09:59.672868  8793 raft_consensus.cc:515] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.673010  8793 leader_election.cc:290] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b [CANDIDATE]: Term 1 election: Requested vote from peers 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547), 5cf59e115bd641f8965613ab616c9b85 (127.8.22.130:46821)
I20251028 09:09:59.673180  8463 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "189fdd871f304eb888387f1f15973390" candidate_uuid: "a73a6d78104342d7bfeb363acfac482b" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "3d8ce816521247e18c0d5b9e130edc21"
I20251028 09:09:59.673192  8594 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "189fdd871f304eb888387f1f15973390" candidate_uuid: "a73a6d78104342d7bfeb363acfac482b" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "5cf59e115bd641f8965613ab616c9b85"
I20251028 09:09:59.673255  8463 raft_consensus.cc:3060] T 189fdd871f304eb888387f1f15973390 P 3d8ce816521247e18c0d5b9e130edc21 [term 0 FOLLOWER]: Advancing to term 1
I20251028 09:09:59.673280  8594 raft_consensus.cc:3060] T 189fdd871f304eb888387f1f15973390 P 5cf59e115bd641f8965613ab616c9b85 [term 0 FOLLOWER]: Advancing to term 1
I20251028 09:09:59.673802  8594 raft_consensus.cc:2468] T 189fdd871f304eb888387f1f15973390 P 5cf59e115bd641f8965613ab616c9b85 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate a73a6d78104342d7bfeb363acfac482b in term 1.
I20251028 09:09:59.673812  8463 raft_consensus.cc:2468] T 189fdd871f304eb888387f1f15973390 P 3d8ce816521247e18c0d5b9e130edc21 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate a73a6d78104342d7bfeb363acfac482b in term 1.
I20251028 09:09:59.673978  8661 leader_election.cc:304] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 5cf59e115bd641f8965613ab616c9b85, a73a6d78104342d7bfeb363acfac482b; no voters: 
I20251028 09:09:59.674070  8793 raft_consensus.cc:2804] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b [term 1 FOLLOWER]: Leader election won for term 1
I20251028 09:09:59.674230  8793 raft_consensus.cc:697] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b [term 1 LEADER]: Becoming Leader. State: Replica: a73a6d78104342d7bfeb363acfac482b, State: Running, Role: LEADER
I20251028 09:09:59.674317  8793 consensus_queue.cc:237] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.675009  8797 raft_consensus.cc:493] T bbe94a484b0c4198bb904ffa7d3ef621 P 3d8ce816521247e18c0d5b9e130edc21 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251028 09:09:59.675063  8797 raft_consensus.cc:515] T bbe94a484b0c4198bb904ffa7d3ef621 P 3d8ce816521247e18c0d5b9e130edc21 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.675076  8321 catalog_manager.cc:5649] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b reported cstate change: term changed from 0 to 1, leader changed from <none> to a73a6d78104342d7bfeb363acfac482b (127.8.22.131). New cstate: current_term: 1 leader_uuid: "a73a6d78104342d7bfeb363acfac482b" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } health_report { overall_health: HEALTHY } } }
I20251028 09:09:59.675210  8797 leader_election.cc:290] T bbe94a484b0c4198bb904ffa7d3ef621 P 3d8ce816521247e18c0d5b9e130edc21 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 5cf59e115bd641f8965613ab616c9b85 (127.8.22.130:46821), a73a6d78104342d7bfeb363acfac482b (127.8.22.131:38055)
I20251028 09:09:59.675396  8725 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "bbe94a484b0c4198bb904ffa7d3ef621" candidate_uuid: "3d8ce816521247e18c0d5b9e130edc21" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "a73a6d78104342d7bfeb363acfac482b" is_pre_election: true
I20251028 09:09:59.675406  8594 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "bbe94a484b0c4198bb904ffa7d3ef621" candidate_uuid: "3d8ce816521247e18c0d5b9e130edc21" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "5cf59e115bd641f8965613ab616c9b85" is_pre_election: true
I20251028 09:09:59.675474  8594 raft_consensus.cc:2468] T bbe94a484b0c4198bb904ffa7d3ef621 P 5cf59e115bd641f8965613ab616c9b85 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 3d8ce816521247e18c0d5b9e130edc21 in term 0.
I20251028 09:09:59.675472  8725 raft_consensus.cc:2468] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 3d8ce816521247e18c0d5b9e130edc21 in term 0.
I20251028 09:09:59.675616  8399 leader_election.cc:304] T bbe94a484b0c4198bb904ffa7d3ef621 P 3d8ce816521247e18c0d5b9e130edc21 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 3d8ce816521247e18c0d5b9e130edc21, 5cf59e115bd641f8965613ab616c9b85; no voters: 
I20251028 09:09:59.675697  8797 raft_consensus.cc:2804] T bbe94a484b0c4198bb904ffa7d3ef621 P 3d8ce816521247e18c0d5b9e130edc21 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20251028 09:09:59.675727  8797 raft_consensus.cc:493] T bbe94a484b0c4198bb904ffa7d3ef621 P 3d8ce816521247e18c0d5b9e130edc21 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20251028 09:09:59.675751  8797 raft_consensus.cc:3060] T bbe94a484b0c4198bb904ffa7d3ef621 P 3d8ce816521247e18c0d5b9e130edc21 [term 0 FOLLOWER]: Advancing to term 1
I20251028 09:09:59.676429  8797 raft_consensus.cc:515] T bbe94a484b0c4198bb904ffa7d3ef621 P 3d8ce816521247e18c0d5b9e130edc21 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.676527  8797 leader_election.cc:290] T bbe94a484b0c4198bb904ffa7d3ef621 P 3d8ce816521247e18c0d5b9e130edc21 [CANDIDATE]: Term 1 election: Requested vote from peers 5cf59e115bd641f8965613ab616c9b85 (127.8.22.130:46821), a73a6d78104342d7bfeb363acfac482b (127.8.22.131:38055)
I20251028 09:09:59.676658  8725 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "bbe94a484b0c4198bb904ffa7d3ef621" candidate_uuid: "3d8ce816521247e18c0d5b9e130edc21" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "a73a6d78104342d7bfeb363acfac482b"
I20251028 09:09:59.676728  8725 raft_consensus.cc:3060] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [term 0 FOLLOWER]: Advancing to term 1
I20251028 09:09:59.676740  8594 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "bbe94a484b0c4198bb904ffa7d3ef621" candidate_uuid: "3d8ce816521247e18c0d5b9e130edc21" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "5cf59e115bd641f8965613ab616c9b85"
I20251028 09:09:59.676815  8594 raft_consensus.cc:3060] T bbe94a484b0c4198bb904ffa7d3ef621 P 5cf59e115bd641f8965613ab616c9b85 [term 0 FOLLOWER]: Advancing to term 1
I20251028 09:09:59.677242  8725 raft_consensus.cc:2468] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 3d8ce816521247e18c0d5b9e130edc21 in term 1.
I20251028 09:09:59.677302  8594 raft_consensus.cc:2468] T bbe94a484b0c4198bb904ffa7d3ef621 P 5cf59e115bd641f8965613ab616c9b85 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 3d8ce816521247e18c0d5b9e130edc21 in term 1.
I20251028 09:09:59.677424  8396 leader_election.cc:304] T bbe94a484b0c4198bb904ffa7d3ef621 P 3d8ce816521247e18c0d5b9e130edc21 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 3d8ce816521247e18c0d5b9e130edc21, a73a6d78104342d7bfeb363acfac482b; no voters: 
I20251028 09:09:59.677521  8797 raft_consensus.cc:2804] T bbe94a484b0c4198bb904ffa7d3ef621 P 3d8ce816521247e18c0d5b9e130edc21 [term 1 FOLLOWER]: Leader election won for term 1
I20251028 09:09:59.677577  8797 raft_consensus.cc:697] T bbe94a484b0c4198bb904ffa7d3ef621 P 3d8ce816521247e18c0d5b9e130edc21 [term 1 LEADER]: Becoming Leader. State: Replica: 3d8ce816521247e18c0d5b9e130edc21, State: Running, Role: LEADER
I20251028 09:09:59.677629  8797 consensus_queue.cc:237] T bbe94a484b0c4198bb904ffa7d3ef621 P 3d8ce816521247e18c0d5b9e130edc21 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.678182  8321 catalog_manager.cc:5649] T bbe94a484b0c4198bb904ffa7d3ef621 P 3d8ce816521247e18c0d5b9e130edc21 reported cstate change: term changed from 0 to 1, leader changed from <none> to 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129). New cstate: current_term: 1 leader_uuid: "3d8ce816521247e18c0d5b9e130edc21" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } health_report { overall_health: UNKNOWN } } }
I20251028 09:09:59.701757  8807 raft_consensus.cc:493] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251028 09:09:59.701865  8807 raft_consensus.cc:515] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.702013  8807 leader_election.cc:290] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547), a73a6d78104342d7bfeb363acfac482b (127.8.22.131:38055)
I20251028 09:09:59.702332  8463 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "61b94d4520cd4af999e8e38b3a475ac4" candidate_uuid: "5cf59e115bd641f8965613ab616c9b85" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "3d8ce816521247e18c0d5b9e130edc21" is_pre_election: true
I20251028 09:09:59.702319  8725 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "61b94d4520cd4af999e8e38b3a475ac4" candidate_uuid: "5cf59e115bd641f8965613ab616c9b85" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "a73a6d78104342d7bfeb363acfac482b" is_pre_election: true
I20251028 09:09:59.702437  8463 raft_consensus.cc:2468] T 61b94d4520cd4af999e8e38b3a475ac4 P 3d8ce816521247e18c0d5b9e130edc21 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 5cf59e115bd641f8965613ab616c9b85 in term 0.
I20251028 09:09:59.702438  8725 raft_consensus.cc:2468] T 61b94d4520cd4af999e8e38b3a475ac4 P a73a6d78104342d7bfeb363acfac482b [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 5cf59e115bd641f8965613ab616c9b85 in term 0.
I20251028 09:09:59.702595  8527 leader_election.cc:304] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 5cf59e115bd641f8965613ab616c9b85, a73a6d78104342d7bfeb363acfac482b; no voters: 
I20251028 09:09:59.702732  8807 raft_consensus.cc:2804] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20251028 09:09:59.702783  8807 raft_consensus.cc:493] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20251028 09:09:59.702798  8807 raft_consensus.cc:3060] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 [term 0 FOLLOWER]: Advancing to term 1
I20251028 09:09:59.703455  8807 raft_consensus.cc:515] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.703589  8807 leader_election.cc:290] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 [CANDIDATE]: Term 1 election: Requested vote from peers 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547), a73a6d78104342d7bfeb363acfac482b (127.8.22.131:38055)
I20251028 09:09:59.703759  8463 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "61b94d4520cd4af999e8e38b3a475ac4" candidate_uuid: "5cf59e115bd641f8965613ab616c9b85" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "3d8ce816521247e18c0d5b9e130edc21"
I20251028 09:09:59.703761  8725 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "61b94d4520cd4af999e8e38b3a475ac4" candidate_uuid: "5cf59e115bd641f8965613ab616c9b85" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "a73a6d78104342d7bfeb363acfac482b"
I20251028 09:09:59.703830  8463 raft_consensus.cc:3060] T 61b94d4520cd4af999e8e38b3a475ac4 P 3d8ce816521247e18c0d5b9e130edc21 [term 0 FOLLOWER]: Advancing to term 1
I20251028 09:09:59.703836  8725 raft_consensus.cc:3060] T 61b94d4520cd4af999e8e38b3a475ac4 P a73a6d78104342d7bfeb363acfac482b [term 0 FOLLOWER]: Advancing to term 1
I20251028 09:09:59.704371  8463 raft_consensus.cc:2468] T 61b94d4520cd4af999e8e38b3a475ac4 P 3d8ce816521247e18c0d5b9e130edc21 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 5cf59e115bd641f8965613ab616c9b85 in term 1.
I20251028 09:09:59.704396  8725 raft_consensus.cc:2468] T 61b94d4520cd4af999e8e38b3a475ac4 P a73a6d78104342d7bfeb363acfac482b [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 5cf59e115bd641f8965613ab616c9b85 in term 1.
I20251028 09:09:59.704566  8528 leader_election.cc:304] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 3d8ce816521247e18c0d5b9e130edc21, 5cf59e115bd641f8965613ab616c9b85; no voters: 
I20251028 09:09:59.704665  8807 raft_consensus.cc:2804] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 [term 1 FOLLOWER]: Leader election won for term 1
I20251028 09:09:59.704716  8807 raft_consensus.cc:697] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 [term 1 LEADER]: Becoming Leader. State: Replica: 5cf59e115bd641f8965613ab616c9b85, State: Running, Role: LEADER
I20251028 09:09:59.704748  8807 consensus_queue.cc:237] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:09:59.705370  8321 catalog_manager.cc:5649] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 reported cstate change: term changed from 0 to 1, leader changed from <none> to 5cf59e115bd641f8965613ab616c9b85 (127.8.22.130). New cstate: current_term: 1 leader_uuid: "5cf59e115bd641f8965613ab616c9b85" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } health_report { overall_health: UNKNOWN } } }
W20251028 09:09:59.706872  8641 tablet.cc:2378] T 189fdd871f304eb888387f1f15973390 P 5cf59e115bd641f8965613ab616c9b85: Can't schedule compaction. Clean time has not been advanced past its initial value.
W20251028 09:09:59.707019  8641 tablet_replica_mm_ops.cc:318] Log GC is disabled (check --enable_log_gc)
I20251028 09:09:59.719806  8282 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6m1lU8/build/release/bin/kudu
/tmp/dist-test-task6m1lU8/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.8.22.132:0
--local_ip_for_outbound_sockets=127.8.22.132
--webserver_interface=127.8.22.132
--webserver_port=0
--tserver_master_addrs=127.8.22.190:38971
--builtin_ntp_servers=127.8.22.148:34983
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--follower_unavailable_considered_failed_sec=2
--enable_log_gc=false with env {}
I20251028 09:09:59.731027  8797 consensus_queue.cc:1048] T 386ef2b749f24de9aac6947b29d3479c P 3d8ce816521247e18c0d5b9e130edc21 [LEADER]: Connected to new peer: Peer: permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20251028 09:09:59.737485  8724 raft_consensus.cc:1275] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b [term 1 FOLLOWER]: Refusing update from remote peer 3d8ce816521247e18c0d5b9e130edc21: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20251028 09:09:59.737640  8723 raft_consensus.cc:1275] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [term 1 FOLLOWER]: Refusing update from remote peer 3d8ce816521247e18c0d5b9e130edc21: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20251028 09:09:59.738310  8816 consensus_queue.cc:1048] T 386ef2b749f24de9aac6947b29d3479c P 3d8ce816521247e18c0d5b9e130edc21 [LEADER]: Connected to new peer: Peer: permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20251028 09:09:59.738471  8593 raft_consensus.cc:1275] T bbe94a484b0c4198bb904ffa7d3ef621 P 5cf59e115bd641f8965613ab616c9b85 [term 1 FOLLOWER]: Refusing update from remote peer 3d8ce816521247e18c0d5b9e130edc21: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20251028 09:09:59.738592  8797 consensus_queue.cc:1048] T a208b359e1a44c018a8ad71c60ff8fed P 3d8ce816521247e18c0d5b9e130edc21 [LEADER]: Connected to new peer: Peer: permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20251028 09:09:59.738713  8797 consensus_queue.cc:1048] T bbe94a484b0c4198bb904ffa7d3ef621 P 3d8ce816521247e18c0d5b9e130edc21 [LEADER]: Connected to new peer: Peer: permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20251028 09:09:59.738866  8594 raft_consensus.cc:1275] T a208b359e1a44c018a8ad71c60ff8fed P 5cf59e115bd641f8965613ab616c9b85 [term 1 FOLLOWER]: Refusing update from remote peer 3d8ce816521247e18c0d5b9e130edc21: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20251028 09:09:59.739107  8816 consensus_queue.cc:1048] T bbe94a484b0c4198bb904ffa7d3ef621 P 3d8ce816521247e18c0d5b9e130edc21 [LEADER]: Connected to new peer: Peer: permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20251028 09:09:59.739238  8816 consensus_queue.cc:1048] T a208b359e1a44c018a8ad71c60ff8fed P 3d8ce816521247e18c0d5b9e130edc21 [LEADER]: Connected to new peer: Peer: permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20251028 09:09:59.739454  8725 raft_consensus.cc:1275] T 61b94d4520cd4af999e8e38b3a475ac4 P a73a6d78104342d7bfeb363acfac482b [term 1 FOLLOWER]: Refusing update from remote peer 5cf59e115bd641f8965613ab616c9b85: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20251028 09:09:59.739858  8593 raft_consensus.cc:1275] T 189fdd871f304eb888387f1f15973390 P 5cf59e115bd641f8965613ab616c9b85 [term 1 FOLLOWER]: Refusing update from remote peer a73a6d78104342d7bfeb363acfac482b: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20251028 09:09:59.740063  8819 consensus_queue.cc:1048] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 [LEADER]: Connected to new peer: Peer: permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20251028 09:09:59.740669  8463 raft_consensus.cc:1275] T 189fdd871f304eb888387f1f15973390 P 3d8ce816521247e18c0d5b9e130edc21 [term 1 FOLLOWER]: Refusing update from remote peer a73a6d78104342d7bfeb363acfac482b: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20251028 09:09:59.740912  8793 consensus_queue.cc:1048] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b [LEADER]: Connected to new peer: Peer: permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20251028 09:09:59.741204  8793 consensus_queue.cc:1048] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b [LEADER]: Connected to new peer: Peer: permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20251028 09:09:59.741794  8722 raft_consensus.cc:1275] T 9e00a8dbf00b4468bbbfc193e444299e P a73a6d78104342d7bfeb363acfac482b [term 1 FOLLOWER]: Refusing update from remote peer 5cf59e115bd641f8965613ab616c9b85: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20251028 09:09:59.742007  8807 consensus_queue.cc:1048] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 [LEADER]: Connected to new peer: Peer: permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20251028 09:09:59.742913  8462 raft_consensus.cc:1275] T 61b94d4520cd4af999e8e38b3a475ac4 P 3d8ce816521247e18c0d5b9e130edc21 [term 1 FOLLOWER]: Refusing update from remote peer 5cf59e115bd641f8965613ab616c9b85: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20251028 09:09:59.743052  8461 raft_consensus.cc:3060] T 9e00a8dbf00b4468bbbfc193e444299e P 3d8ce816521247e18c0d5b9e130edc21 [term 0 FOLLOWER]: Advancing to term 1
I20251028 09:09:59.743516  8836 mvcc.cc:204] Tried to move back new op lower bound from 7215688088526065664 to 7215688088279838720. Current Snapshot: MvccSnapshot[applied={T|T < 7215688088522391552}]
I20251028 09:09:59.743937  8461 raft_consensus.cc:1275] T 9e00a8dbf00b4468bbbfc193e444299e P 3d8ce816521247e18c0d5b9e130edc21 [term 1 FOLLOWER]: Refusing update from remote peer 5cf59e115bd641f8965613ab616c9b85: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20251028 09:09:59.744951  8843 mvcc.cc:204] Tried to move back new op lower bound from 7215688088541401088 to 7215688088186535936. Current Snapshot: MvccSnapshot[applied={T|T < 7215688088532201472}]
I20251028 09:09:59.745396  8840 mvcc.cc:204] Tried to move back new op lower bound from 7215688088541401088 to 7215688088186535936. Current Snapshot: MvccSnapshot[applied={T|T < 7215688088532201472}]
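The mvcc.cc lines above record an attempt to move the "new op lower bound" backwards, which MVCC refuses so the watermark stays monotonically non-decreasing. A minimal sketch of that clamping behaviour, assuming timestamps can be modelled as plain integers; the class and method names here are made up for illustration and are not Kudu's actual mvcc.cc code:

#include <cstdint>
#include <iostream>

// Hypothetical illustration: keep a watermark that only moves forward.
class LowerBoundTracker {
 public:
  void AdjustNewOpLowerBound(uint64_t proposed) {
    if (proposed < lower_bound_) {
      // Mirrors the spirit of the "Tried to move back new op lower bound"
      // message: the proposal is logged and ignored rather than applied.
      std::cout << "Tried to move back lower bound from " << lower_bound_
                << " to " << proposed << "; keeping current value\n";
      return;
    }
    lower_bound_ = proposed;
  }
  uint64_t lower_bound() const { return lower_bound_; }

 private:
  uint64_t lower_bound_ = 0;
};

int main() {
  LowerBoundTracker t;
  t.AdjustNewOpLowerBound(7215688088526065664ULL);
  t.AdjustNewOpLowerBound(7215688088279838720ULL);  // older value: refused
  std::cout << "final lower bound: " << t.lower_bound() << "\n";
  return 0;
}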
I20251028 09:09:59.745436  8819 consensus_queue.cc:1048] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 [LEADER]: Connected to new peer: Peer: permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20251028 09:09:59.745530  8819 consensus_queue.cc:1048] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 [LEADER]: Connected to new peer: Peer: permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
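The interleaved "Refusing update from remote peer ... Log matching property violated ... (index mismatch)" and "Connected to new peer ... Status: LMP_MISMATCH" lines reflect Raft's log matching property: a follower accepts a batch of operations only if its log already contains an entry matching the preceding OpId (term and index) the leader claims; otherwise the leader lowers that peer's next index and retries from earlier in the log. A minimal, self-contained sketch of the follower-side check, with illustrative types and names (OpId here is a stand-in, and this is not Kudu's raft_consensus.cc logic):

#include <cstdint>
#include <iostream>
#include <vector>

// Illustrative stand-ins; not Kudu's real types.
struct OpId { int64_t term; int64_t index; };

// Does the follower's log contain an entry at `prev.index` with exactly
// term `prev.term`? Index 0 / term 0 denotes the empty log prefix.
bool LogMatches(const std::vector<OpId>& log, const OpId& prev) {
  if (prev.index == 0) return true;                                  // empty prefix always matches
  if (static_cast<size_t>(prev.index) > log.size()) return false;    // follower is behind
  return log[prev.index - 1].term == prev.term;                      // terms must agree at that index
}

int main() {
  std::vector<OpId> follower_log;       // "Preceding OpId in replica: term: 0 index: 0"
  OpId preceding_from_leader{1, 2};     // "Preceding OpId from leader: term: 1 index: 2"
  if (!LogMatches(follower_log, preceding_from_leader)) {
    // This is the condition behind the refusal above; the leader reacts by
    // lowering the peer's next index and resending earlier entries.
    std::cout << "LMP_MISMATCH: refuse update; leader retries from an earlier index\n";
  }
  return 0;
}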
W20251028 09:09:59.830281  8772 tablet_replica_mm_ops.cc:318] Log GC is disabled (check --enable_log_gc)
W20251028 09:09:59.843056  8510 tablet_replica_mm_ops.cc:318] Log GC is disabled (check --enable_log_gc)
W20251028 09:09:59.892153  8830 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251028 09:09:59.892565  8830 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251028 09:09:59.892656  8830 flags.cc:425] Enabled unsafe flag: --enable_log_gc=false
W20251028 09:09:59.892738  8830 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251028 09:09:59.895351  8830 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251028 09:09:59.895500  8830 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.8.22.132
I20251028 09:09:59.901326  8830 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.8.22.148:34983
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--follower_unavailable_considered_failed_sec=2
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.8.22.132:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.8.22.132
--webserver_port=0
--enable_log_gc=false
--tserver_master_addrs=127.8.22.190:38971
--never_fsync=true
--heap_profile_path=/tmp/kudu.8830
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.8.22.132
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 60f5e5267b92c39485a66121d3ce3cc7ef57b0e0
build type RELEASE
built by None at 28 Oct 2025 08:43:13 UTC on 5fd53c4cbb9d
build id 8755
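Two of the non-default flags in the dump above drive this test's timing: --raft_heartbeat_interval_ms=100 and --follower_unavailable_considered_failed_sec=2, i.e. a follower silent for roughly two seconds (about twenty heartbeat periods) is treated as failed and would normally be evicted and re-replicated. A rough sketch of that kind of staleness check, using the flag values from this run; the struct and function names are invented for illustration and do not correspond to Kudu internals:

#include <chrono>
#include <iostream>

// Flag values as configured in this test run.
constexpr int kRaftHeartbeatIntervalMs = 100;
constexpr int kFollowerUnavailableConsideredFailedSec = 2;

// Hypothetical per-peer bookkeeping a leader might keep.
struct PeerHealth {
  std::chrono::steady_clock::time_point last_successful_contact;
};

bool FollowerConsideredFailed(const PeerHealth& peer,
                              std::chrono::steady_clock::time_point now) {
  auto silent_for = std::chrono::duration_cast<std::chrono::seconds>(
      now - peer.last_successful_contact);
  return silent_for.count() >= kFollowerUnavailableConsideredFailedSec;
}

int main() {
  auto now = std::chrono::steady_clock::now();
  PeerHealth dead_peer{now - std::chrono::seconds(3)};   // silent for 3s > 2s
  PeerHealth live_peer{now - std::chrono::milliseconds(kRaftHeartbeatIntervalMs)};
  std::cout << "dead peer failed? " << FollowerConsideredFailed(dead_peer, now) << "\n";
  std::cout << "live peer failed? " << FollowerConsideredFailed(live_peer, now) << "\n";
  return 0;
}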
I20251028 09:09:59.901875  8830 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251028 09:09:59.902284  8830 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251028 09:09:59.905750  8890 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:09:59.906639  8888 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:09:59.906648  8887 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:09:59.907759  8830 server_base.cc:1047] running on GCE node
I20251028 09:09:59.908061  8830 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251028 09:09:59.908356  8830 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251028 09:09:59.923867  8830 hybrid_clock.cc:648] HybridClock initialized: now 1761642599923829 us; error 53 us; skew 500 ppm
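The HybridClock line reports an initial error of 53 us and a maximum skew of 500 ppm; between synchronizations the clock's error bound grows by the skew rate times the elapsed time. A back-of-the-envelope sketch of that growth under those two reported values (an illustrative calculation only, not the actual hybrid_clock.cc bookkeeping):

#include <cstdint>
#include <iostream>

// Values reported in the log line above.
constexpr int64_t kInitialErrorUs = 53;
constexpr int64_t kMaxSkewPpm = 500;  // parts per million

// Upper bound on clock error after `elapsed_us` without a new sync:
// the initial error plus skew accumulated over the elapsed interval.
int64_t MaxErrorUs(int64_t elapsed_us) {
  return kInitialErrorUs + (elapsed_us * kMaxSkewPpm) / 1000000;
}

int main() {
  // After one second without resync the bound grows by 500 us.
  std::cout << "error bound after 1s:  " << MaxErrorUs(1000000)  << " us\n";  // 553
  std::cout << "error bound after 10s: " << MaxErrorUs(10000000) << " us\n";  // 5053
  return 0;
}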
I20251028 09:09:59.925856  8830 webserver.cc:492] Webserver started at http://127.8.22.132:45171/ using document root <none> and password file <none>
I20251028 09:09:59.926157  8830 fs_manager.cc:362] Metadata directory not provided
I20251028 09:09:59.926225  8830 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251028 09:09:59.926379  8830 server_base.cc:895] This appears to be a new deployment of Kudu; creating new FS layout
I20251028 09:09:59.927601  8830 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-3/data/instance:
uuid: "19ef51fd6f6b47869b63ac0dff2bc4f4"
format_stamp: "Formatted at 2025-10-28 09:09:59 on dist-test-slave-kqwd"
I20251028 09:09:59.928037  8830 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-3/wal/instance:
uuid: "19ef51fd6f6b47869b63ac0dff2bc4f4"
format_stamp: "Formatted at 2025-10-28 09:09:59 on dist-test-slave-kqwd"
I20251028 09:09:59.930871  8830 fs_manager.cc:696] Time spent creating directory manager: real 0.003s	user 0.002s	sys 0.000s
I20251028 09:09:59.932050  8898 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:09:59.932212  8830 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20251028 09:09:59.932287  8830 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-3/data,/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-3/wal
uuid: "19ef51fd6f6b47869b63ac0dff2bc4f4"
format_stamp: "Formatted at 2025-10-28 09:09:59 on dist-test-slave-kqwd"
I20251028 09:09:59.932351  8830 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251028 09:09:59.955852  8830 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251028 09:09:59.956358  8830 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251028 09:09:59.956909  8830 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251028 09:09:59.957587  8830 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251028 09:09:59.958020  8830 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20251028 09:09:59.958231  8830 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:09:59.958356  8830 ts_tablet_manager.cc:616] Registered 0 tablets
I20251028 09:09:59.958431  8830 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:09:59.966200  8830 rpc_server.cc:307] RPC server started. Bound to: 127.8.22.132:34091
I20251028 09:09:59.966691  8830 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-3/data/info.pb
I20251028 09:09:59.970147  8282 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6m1lU8/build/release/bin/kudu as pid 8830
I20251028 09:09:59.970328  8282 external_mini_cluster.cc:1442] Reading /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-3/wal/instance
I20251028 09:09:59.977552  9012 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.8.22.132:34091 every 8 connection(s)
I20251028 09:10:00.069180  9013 heartbeater.cc:344] Connected to a master server at 127.8.22.190:38971
I20251028 09:10:00.069370  9013 heartbeater.cc:461] Registering TS with master...
I20251028 09:10:00.073297  9013 heartbeater.cc:507] Master 127.8.22.190:38971 requested a full tablet report, sending...
I20251028 09:10:00.074805  8313 ts_manager.cc:194] Registered new tserver with Master: 19ef51fd6f6b47869b63ac0dff2bc4f4 (127.8.22.132:34091)
I20251028 09:10:00.075423  8313 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.8.22.132:40381
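The heartbeater.cc and ts_manager.cc lines above show the registration handshake: the new tablet server connects to the master, registers, and, because the master has no record of it, is asked for a full tablet report; the master then records the tserver and signs its certificate. A compressed sketch of that exchange under those assumptions; the Master/TabletReport types and method names below are illustrative stand-ins, not Kudu's RPC interface:

#include <algorithm>
#include <iostream>
#include <string>
#include <vector>

// Hypothetical stand-ins for the master-side state and the calls involved.
struct TabletReport { bool full = false; std::vector<std::string> tablet_ids; };

struct Master {
  std::vector<std::string> known_tservers;

  // Returns true when the master wants a full report (e.g. first contact).
  bool Register(const std::string& ts_uuid) {
    bool first_contact =
        std::find(known_tservers.begin(), known_tservers.end(), ts_uuid) ==
        known_tservers.end();
    if (first_contact) known_tservers.push_back(ts_uuid);
    return first_contact;
  }

  void ProcessReport(const std::string& ts_uuid, const TabletReport& r) {
    std::cout << "master: processed " << (r.full ? "full" : "incremental")
              << " report from " << ts_uuid << "\n";
  }
};

int main() {
  Master master;
  std::string uuid = "19ef51fd6f6b47869b63ac0dff2bc4f4";  // uuid from the log above

  // One iteration of the heartbeat loop.
  bool needs_full_report = master.Register(uuid);  // "Registering TS with master..."
  TabletReport report;
  report.full = needs_full_report;                 // "requested a full tablet report, sending..."
  master.ProcessReport(uuid, report);
  return 0;
}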
I20251028 09:10:00.225870  8313 ts_manager.cc:295] Set tserver state for 3d8ce816521247e18c0d5b9e130edc21 to MAINTENANCE_MODE
I20251028 09:10:00.226377  8282 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6m1lU8/build/release/bin/kudu with pid 8381
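These two lines are the crux of the test: the master first sets tserver 3d8ce816521247e18c0d5b9e130edc21 to MAINTENANCE_MODE, and the harness then kills that process. With maintenance mode set, replicas hosted on that server should be treated as temporarily unavailable rather than permanently failed, so the usual evict-and-re-replicate path is suppressed even after the follower-failure timeout elapses. A simplified model of that decision, assuming the policy can be reduced to a per-tserver lookup; this is a hedged sketch, not the actual Kudu policy code:

#include <iostream>
#include <set>
#include <string>

// Hypothetical model: replicas on a tserver marked MAINTENANCE_MODE are not
// considered permanently failed, so no replacement replica is created.
bool ShouldRereplicate(const std::string& failed_ts_uuid,
                       const std::set<std::string>& tservers_in_maintenance) {
  return tservers_in_maintenance.count(failed_ts_uuid) == 0;
}

int main() {
  std::set<std::string> in_maintenance = {"3d8ce816521247e18c0d5b9e130edc21"};
  std::cout << std::boolalpha
            << ShouldRereplicate("3d8ce816521247e18c0d5b9e130edc21", in_maintenance)
            << "\n";  // false: maintenance mode suppresses re-replication
  std::cout << ShouldRereplicate("some_other_tserver", in_maintenance)
            << "\n";  // true: a non-maintenance tserver would be re-replicated
  return 0;
}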
W20251028 09:10:00.235493  8659 proxy.cc:239] Call had error, refreshing address and retrying: Network error: recv got EOF from 127.8.22.129:41547 (error 108)
W20251028 09:10:00.235641  8528 connection.cc:537] server connection from 127.8.22.129:49635 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20251028 09:10:00.235705  8780 connection.cc:537] client connection to 127.8.22.129:41547 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20251028 09:10:00.235796  8528 connection.cc:537] client connection to 127.8.22.129:41547 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20251028 09:10:00.235836  8780 meta_cache.cc:302] tablet 386ef2b749f24de9aac6947b29d3479c: replica 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547) has failed: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20251028 09:10:00.235855  8528 proxy.cc:239] Call had error, refreshing address and retrying: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20251028 09:10:00.236296  8661 connection.cc:537] server connection from 127.8.22.129:42399 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20251028 09:10:00.236553  8528 consensus_peers.cc:597] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251028 09:10:00.236613  8528 consensus_peers.cc:597] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251028 09:10:00.236670  8659 consensus_peers.cc:597] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
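The consensus_peers.cc warnings above note "This is attempt 1: this message will repeat every 5th retry", i.e. the repeated connection failures to the killed tserver are retried continuously but only logged on a throttled schedule. A tiny sketch of that throttling pattern, assuming the rule is simply "log attempt 1 and every 5th attempt thereafter" (illustrative only):

#include <iostream>
#include <string>

// Warn on the first failed attempt and then only on every 5th retry.
void MaybeWarn(int attempt, const std::string& msg) {
  if (attempt == 1 || attempt % 5 == 0) {
    std::cout << "W attempt " << attempt << ": " << msg << "\n";
  }
}

int main() {
  for (int attempt = 1; attempt <= 12; ++attempt) {
    MaybeWarn(attempt, "Couldn't send request to peer; connection refused");
  }
  // Prints only for attempts 1, 5, and 10.
  return 0;
}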
W20251028 09:10:00.237396  8548 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.237401  8682 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.237601  8552 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.238426  8552 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.239674  8682 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.239729  8552 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.239729  8548 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.240031  8548 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.240767  8548 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.240767  8552 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.241947  8548 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.242184  8682 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.244637  8682 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.244655  8683 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.245146  8548 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.245503  8683 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.246661  8683 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.247779  8683 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.247817  8682 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.254825  8682 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.256088  8548 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.259006  8682 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.260596  8548 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.261390  8548 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.262286  8548 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.262709  8682 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.265235  8548 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.265482  8548 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.269161  8682 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.274348  8682 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.275851  8548 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.277992  8682 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.279424  8682 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.280531  8682 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.281890  8548 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.287863  8682 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.290620  8548 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.290620  8552 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.290645  8550 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.295493  8682 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.299291  8682 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.302418  8682 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.309578  8550 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.316830  8682 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.327672  8683 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.327672  8682 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.328852  8550 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.332404  8550 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.335366  8550 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.336459  8550 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.337579  8552 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.337579  8550 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.342520  8552 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.343585  8552 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.345189  8682 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.349838  8682 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.354198  8683 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.354198  8682 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.365828  8683 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.375214  8683 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.376312  8683 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.376331  8682 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.382431  8552 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.387887  8552 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.391104  8552 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.392201  8552 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.395335  8552 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.397058  8552 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.400084  8552 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.402671  8682 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.404667  8682 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.428403  8682 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.429512  8682 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.430702  8552 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.435907  8682 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.437985  8682 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.440160  8682 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.440213  8683 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.446148  8552 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.448791  8552 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.450906  8552 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.452531  8683 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.459369  8552 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.462611  8552 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.481279  8552 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.493211  8683 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.497313  8552 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.498534  8683 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.503511  8683 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.506192  8683 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.507611  8552 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.508670  8683 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.510787  8552 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.517289  8552 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.522518  8552 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.522624  8683 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
I20251028 09:10:00.523083  8846 raft_consensus.cc:493] T a208b359e1a44c018a8ad71c60ff8fed P 5cf59e115bd641f8965613ab616c9b85 [term 1 FOLLOWER]: Starting pre-election (detected failure of leader 3d8ce816521247e18c0d5b9e130edc21)
I20251028 09:10:00.523139  8846 raft_consensus.cc:515] T a208b359e1a44c018a8ad71c60ff8fed P 5cf59e115bd641f8965613ab616c9b85 [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } }
I20251028 09:10:00.523298  8846 leader_election.cc:290] T a208b359e1a44c018a8ad71c60ff8fed P 5cf59e115bd641f8965613ab616c9b85 [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers a73a6d78104342d7bfeb363acfac482b (127.8.22.131:38055), 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547)
I20251028 09:10:00.523437  8721 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "a208b359e1a44c018a8ad71c60ff8fed" candidate_uuid: "5cf59e115bd641f8965613ab616c9b85" candidate_term: 2 candidate_status { last_received { term: 1 index: 168 } } ignore_live_leader: false dest_uuid: "a73a6d78104342d7bfeb363acfac482b" is_pre_election: true
W20251028 09:10:00.523869  8683 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.523880  8528 leader_election.cc:336] T a208b359e1a44c018a8ad71c60ff8fed P 5cf59e115bd641f8965613ab616c9b85 [CANDIDATE]: Term 2 pre-election: RPC error from VoteRequest() call to peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111)
I20251028 09:10:00.523932  8528 leader_election.cc:304] T a208b359e1a44c018a8ad71c60ff8fed P 5cf59e115bd641f8965613ab616c9b85 [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 5cf59e115bd641f8965613ab616c9b85; no voters: 3d8ce816521247e18c0d5b9e130edc21, a73a6d78104342d7bfeb363acfac482b
I20251028 09:10:00.524016  8807 raft_consensus.cc:2749] T a208b359e1a44c018a8ad71c60ff8fed P 5cf59e115bd641f8965613ab616c9b85 [term 1 FOLLOWER]: Leader pre-election lost for term 2. Reason: could not achieve majority
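The leader_election.cc:304 decision above shows how the outcome is tallied: with three voters a candidate needs two yes votes, but 5cf59e115bd641f8965613ab616c9b85 collected only its own yes, because the other live follower denied the pre-vote and the RPC error from the unreachable peer can never become a yes, so the pre-election is lost "could not achieve majority". A small sketch of that tally, assuming denials and RPC failures can both be counted against the candidate (illustrative only, not Kudu's leader_election.cc):

#include <iostream>

// Illustrative vote counter: with N voters a candidate needs floor(N/2)+1 yes
// votes; unreachable peers effectively count against the candidate because
// their response can never be a yes.
struct ElectionTally {
  int num_voters;
  int yes_votes = 0;
  int no_votes = 0;  // explicit denials and RPC failures

  int MajorityThreshold() const { return num_voters / 2 + 1; }

  void RecordYes() { ++yes_votes; }
  void RecordNoOrError() { ++no_votes; }

  bool Decided() const {
    return yes_votes >= MajorityThreshold() ||
           no_votes > num_voters - MajorityThreshold();
  }
  bool Won() const { return yes_votes >= MajorityThreshold(); }
};

int main() {
  ElectionTally tally{3};
  tally.RecordYes();        // the candidate's own vote
  tally.RecordNoOrError();  // live peer denied the pre-vote
  tally.RecordNoOrError();  // killed peer unreachable (connection refused)
  std::cout << "decided=" << tally.Decided()
            << " won=" << tally.Won() << "\n";  // decided=1 won=0: "candidate lost"
  return 0;
}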
I20251028 09:10:00.525166  8822 raft_consensus.cc:493] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b [term 1 FOLLOWER]: Starting pre-election (detected failure of leader 3d8ce816521247e18c0d5b9e130edc21)
I20251028 09:10:00.525221  8846 raft_consensus.cc:493] T 386ef2b749f24de9aac6947b29d3479c P 5cf59e115bd641f8965613ab616c9b85 [term 1 FOLLOWER]: Starting pre-election (detected failure of leader 3d8ce816521247e18c0d5b9e130edc21)
I20251028 09:10:00.525244  8822 raft_consensus.cc:515] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } }
I20251028 09:10:00.525292  8846 raft_consensus.cc:515] T 386ef2b749f24de9aac6947b29d3479c P 5cf59e115bd641f8965613ab616c9b85 [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:10:00.525393  8822 leader_election.cc:290] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers 5cf59e115bd641f8965613ab616c9b85 (127.8.22.130:46821), 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547)
I20251028 09:10:00.525439  8822 raft_consensus.cc:493] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [term 1 FOLLOWER]: Starting pre-election (detected failure of leader 3d8ce816521247e18c0d5b9e130edc21)
I20251028 09:10:00.525444  8846 leader_election.cc:290] T 386ef2b749f24de9aac6947b29d3479c P 5cf59e115bd641f8965613ab616c9b85 [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547), a73a6d78104342d7bfeb363acfac482b (127.8.22.131:38055)
I20251028 09:10:00.525471  8822 raft_consensus.cc:515] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:10:00.525492  8904 raft_consensus.cc:493] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b [term 1 FOLLOWER]: Starting pre-election (detected failure of leader 3d8ce816521247e18c0d5b9e130edc21)
I20251028 09:10:00.525565  8822 leader_election.cc:290] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547), 5cf59e115bd641f8965613ab616c9b85 (127.8.22.130:46821)
I20251028 09:10:00.525547  8904 raft_consensus.cc:515] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:10:00.525651  8904 leader_election.cc:290] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547), 5cf59e115bd641f8965613ab616c9b85 (127.8.22.130:46821)
I20251028 09:10:00.525648  8846 raft_consensus.cc:493] T bbe94a484b0c4198bb904ffa7d3ef621 P 5cf59e115bd641f8965613ab616c9b85 [term 1 FOLLOWER]: Starting pre-election (detected failure of leader 3d8ce816521247e18c0d5b9e130edc21)
I20251028 09:10:00.525691  8846 raft_consensus.cc:515] T bbe94a484b0c4198bb904ffa7d3ef621 P 5cf59e115bd641f8965613ab616c9b85 [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:10:00.525719  8592 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "a208b359e1a44c018a8ad71c60ff8fed" candidate_uuid: "a73a6d78104342d7bfeb363acfac482b" candidate_term: 2 candidate_status { last_received { term: 1 index: 169 } } ignore_live_leader: false dest_uuid: "5cf59e115bd641f8965613ab616c9b85" is_pre_election: true
I20251028 09:10:00.525779  8592 raft_consensus.cc:2468] T a208b359e1a44c018a8ad71c60ff8fed P 5cf59e115bd641f8965613ab616c9b85 [term 1 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate a73a6d78104342d7bfeb363acfac482b in term 1.
I20251028 09:10:00.525794  8846 leader_election.cc:290] T bbe94a484b0c4198bb904ffa7d3ef621 P 5cf59e115bd641f8965613ab616c9b85 [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547), a73a6d78104342d7bfeb363acfac482b (127.8.22.131:38055)
I20251028 09:10:00.525867  8592 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "386ef2b749f24de9aac6947b29d3479c" candidate_uuid: "a73a6d78104342d7bfeb363acfac482b" candidate_term: 2 candidate_status { last_received { term: 1 index: 169 } } ignore_live_leader: false dest_uuid: "5cf59e115bd641f8965613ab616c9b85" is_pre_election: true
I20251028 09:10:00.525921  8591 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "bbe94a484b0c4198bb904ffa7d3ef621" candidate_uuid: "a73a6d78104342d7bfeb363acfac482b" candidate_term: 2 candidate_status { last_received { term: 1 index: 169 } } ignore_live_leader: false dest_uuid: "5cf59e115bd641f8965613ab616c9b85" is_pre_election: true
I20251028 09:10:00.525921  8592 raft_consensus.cc:2468] T 386ef2b749f24de9aac6947b29d3479c P 5cf59e115bd641f8965613ab616c9b85 [term 1 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate a73a6d78104342d7bfeb363acfac482b in term 1.
I20251028 09:10:00.526023  8591 raft_consensus.cc:2468] T bbe94a484b0c4198bb904ffa7d3ef621 P 5cf59e115bd641f8965613ab616c9b85 [term 1 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate a73a6d78104342d7bfeb363acfac482b in term 1.
I20251028 09:10:00.526116  8661 leader_election.cc:304] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 5cf59e115bd641f8965613ab616c9b85, a73a6d78104342d7bfeb363acfac482b; no voters: 
I20251028 09:10:00.526212  8904 raft_consensus.cc:2804] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b [term 1 FOLLOWER]: Leader pre-election won for term 2
I20251028 09:10:00.526242  8904 raft_consensus.cc:493] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b [term 1 FOLLOWER]: Starting leader election (detected failure of leader 3d8ce816521247e18c0d5b9e130edc21)
I20251028 09:10:00.526226  8721 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "386ef2b749f24de9aac6947b29d3479c" candidate_uuid: "5cf59e115bd641f8965613ab616c9b85" candidate_term: 2 candidate_status { last_received { term: 1 index: 168 } } ignore_live_leader: false dest_uuid: "a73a6d78104342d7bfeb363acfac482b" is_pre_election: true
I20251028 09:10:00.526271  8904 raft_consensus.cc:3060] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b [term 1 FOLLOWER]: Advancing to term 2
W20251028 09:10:00.526283  8659 leader_election.cc:336] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b [CANDIDATE]: Term 2 pre-election: RPC error from VoteRequest() call to peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111)
W20251028 09:10:00.526341  8659 leader_election.cc:336] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [CANDIDATE]: Term 2 pre-election: RPC error from VoteRequest() call to peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111)
W20251028 09:10:00.526357  8659 leader_election.cc:336] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b [CANDIDATE]: Term 2 pre-election: RPC error from VoteRequest() call to peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111)
I20251028 09:10:00.526299  8721 raft_consensus.cc:2410] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b [term 1 FOLLOWER]: Leader pre-election vote request: Denying vote to candidate 5cf59e115bd641f8965613ab616c9b85 for term 2 because replica has last-logged OpId of term: 1 index: 169, which is greater than that of the candidate, which has last-logged OpId of term: 1 index: 168.
W20251028 09:10:00.526400  8528 leader_election.cc:336] T bbe94a484b0c4198bb904ffa7d3ef621 P 5cf59e115bd641f8965613ab616c9b85 [CANDIDATE]: Term 2 pre-election: RPC error from VoteRequest() call to peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111)
W20251028 09:10:00.526436  8528 leader_election.cc:336] T 386ef2b749f24de9aac6947b29d3479c P 5cf59e115bd641f8965613ab616c9b85 [CANDIDATE]: Term 2 pre-election: RPC error from VoteRequest() call to peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111)
W20251028 09:10:00.526589  8552 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
I20251028 09:10:00.526597  8721 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "bbe94a484b0c4198bb904ffa7d3ef621" candidate_uuid: "5cf59e115bd641f8965613ab616c9b85" candidate_term: 2 candidate_status { last_received { term: 1 index: 169 } } ignore_live_leader: false dest_uuid: "a73a6d78104342d7bfeb363acfac482b" is_pre_election: true
I20251028 09:10:00.526650  8721 raft_consensus.cc:2468] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [term 1 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 5cf59e115bd641f8965613ab616c9b85 in term 1.
I20251028 09:10:00.526707  8527 leader_election.cc:304] T 386ef2b749f24de9aac6947b29d3479c P 5cf59e115bd641f8965613ab616c9b85 [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 5cf59e115bd641f8965613ab616c9b85; no voters: 3d8ce816521247e18c0d5b9e130edc21, a73a6d78104342d7bfeb363acfac482b
I20251028 09:10:00.526785  8807 raft_consensus.cc:2749] T 386ef2b749f24de9aac6947b29d3479c P 5cf59e115bd641f8965613ab616c9b85 [term 1 FOLLOWER]: Leader pre-election lost for term 2. Reason: could not achieve majority
I20251028 09:10:00.526851  8527 leader_election.cc:304] T bbe94a484b0c4198bb904ffa7d3ef621 P 5cf59e115bd641f8965613ab616c9b85 [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 5cf59e115bd641f8965613ab616c9b85, a73a6d78104342d7bfeb363acfac482b; no voters: 3d8ce816521247e18c0d5b9e130edc21
I20251028 09:10:00.526907  8807 raft_consensus.cc:2804] T bbe94a484b0c4198bb904ffa7d3ef621 P 5cf59e115bd641f8965613ab616c9b85 [term 1 FOLLOWER]: Leader pre-election won for term 2
I20251028 09:10:00.526934  8807 raft_consensus.cc:493] T bbe94a484b0c4198bb904ffa7d3ef621 P 5cf59e115bd641f8965613ab616c9b85 [term 1 FOLLOWER]: Starting leader election (detected failure of leader 3d8ce816521247e18c0d5b9e130edc21)
I20251028 09:10:00.526976  8807 raft_consensus.cc:3060] T bbe94a484b0c4198bb904ffa7d3ef621 P 5cf59e115bd641f8965613ab616c9b85 [term 1 FOLLOWER]: Advancing to term 2
I20251028 09:10:00.527091  8661 leader_election.cc:304] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 5cf59e115bd641f8965613ab616c9b85, a73a6d78104342d7bfeb363acfac482b; no voters: 3d8ce816521247e18c0d5b9e130edc21
I20251028 09:10:00.527223  8661 leader_election.cc:304] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 5cf59e115bd641f8965613ab616c9b85, a73a6d78104342d7bfeb363acfac482b; no voters: 3d8ce816521247e18c0d5b9e130edc21
I20251028 09:10:00.527323  8882 raft_consensus.cc:2804] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b [term 1 FOLLOWER]: Leader pre-election won for term 2
I20251028 09:10:00.527355  8882 raft_consensus.cc:493] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b [term 1 FOLLOWER]: Starting leader election (detected failure of leader 3d8ce816521247e18c0d5b9e130edc21)
I20251028 09:10:00.527377  8882 raft_consensus.cc:3060] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b [term 1 FOLLOWER]: Advancing to term 2
I20251028 09:10:00.527366  8904 raft_consensus.cc:515] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b [term 2 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } }
I20251028 09:10:00.527490  8904 leader_election.cc:290] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b [CANDIDATE]: Term 2 election: Requested vote from peers 5cf59e115bd641f8965613ab616c9b85 (127.8.22.130:46821), 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547)
I20251028 09:10:00.527536  8904 raft_consensus.cc:2804] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [term 1 FOLLOWER]: Leader pre-election won for term 2
I20251028 09:10:00.527565  8904 raft_consensus.cc:493] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [term 1 FOLLOWER]: Starting leader election (detected failure of leader 3d8ce816521247e18c0d5b9e130edc21)
I20251028 09:10:00.527588  8904 raft_consensus.cc:3060] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [term 1 FOLLOWER]: Advancing to term 2
I20251028 09:10:00.527751  8807 raft_consensus.cc:515] T bbe94a484b0c4198bb904ffa7d3ef621 P 5cf59e115bd641f8965613ab616c9b85 [term 2 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:10:00.527863  8807 leader_election.cc:290] T bbe94a484b0c4198bb904ffa7d3ef621 P 5cf59e115bd641f8965613ab616c9b85 [CANDIDATE]: Term 2 election: Requested vote from peers 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547), a73a6d78104342d7bfeb363acfac482b (127.8.22.131:38055)
I20251028 09:10:00.528112  8882 raft_consensus.cc:515] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b [term 2 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:10:00.528205  8904 raft_consensus.cc:515] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [term 2 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:10:00.528259  8882 leader_election.cc:290] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b [CANDIDATE]: Term 2 election: Requested vote from peers 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547), 5cf59e115bd641f8965613ab616c9b85 (127.8.22.130:46821)
I20251028 09:10:00.528303  8904 leader_election.cc:290] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [CANDIDATE]: Term 2 election: Requested vote from peers 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547), 5cf59e115bd641f8965613ab616c9b85 (127.8.22.130:46821)
I20251028 09:10:00.528555  8591 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "a208b359e1a44c018a8ad71c60ff8fed" candidate_uuid: "a73a6d78104342d7bfeb363acfac482b" candidate_term: 2 candidate_status { last_received { term: 1 index: 169 } } ignore_live_leader: false dest_uuid: "5cf59e115bd641f8965613ab616c9b85"
I20251028 09:10:00.528602  8592 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "386ef2b749f24de9aac6947b29d3479c" candidate_uuid: "a73a6d78104342d7bfeb363acfac482b" candidate_term: 2 candidate_status { last_received { term: 1 index: 169 } } ignore_live_leader: false dest_uuid: "5cf59e115bd641f8965613ab616c9b85"
I20251028 09:10:00.528654  8592 raft_consensus.cc:3060] T 386ef2b749f24de9aac6947b29d3479c P 5cf59e115bd641f8965613ab616c9b85 [term 1 FOLLOWER]: Advancing to term 2
I20251028 09:10:00.528605  8593 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "bbe94a484b0c4198bb904ffa7d3ef621" candidate_uuid: "a73a6d78104342d7bfeb363acfac482b" candidate_term: 2 candidate_status { last_received { term: 1 index: 169 } } ignore_live_leader: false dest_uuid: "5cf59e115bd641f8965613ab616c9b85"
I20251028 09:10:00.528635  8591 raft_consensus.cc:3060] T a208b359e1a44c018a8ad71c60ff8fed P 5cf59e115bd641f8965613ab616c9b85 [term 1 FOLLOWER]: Advancing to term 2
I20251028 09:10:00.528736  8593 raft_consensus.cc:2393] T bbe94a484b0c4198bb904ffa7d3ef621 P 5cf59e115bd641f8965613ab616c9b85 [term 2 FOLLOWER]: Leader election vote request: Denying vote to candidate a73a6d78104342d7bfeb363acfac482b in current term 2: Already voted for candidate 5cf59e115bd641f8965613ab616c9b85 in this term.
I20251028 09:10:00.529503  8592 raft_consensus.cc:2468] T 386ef2b749f24de9aac6947b29d3479c P 5cf59e115bd641f8965613ab616c9b85 [term 2 FOLLOWER]: Leader election vote request: Granting yes vote for candidate a73a6d78104342d7bfeb363acfac482b in term 2.
I20251028 09:10:00.529582  8591 raft_consensus.cc:2468] T a208b359e1a44c018a8ad71c60ff8fed P 5cf59e115bd641f8965613ab616c9b85 [term 2 FOLLOWER]: Leader election vote request: Granting yes vote for candidate a73a6d78104342d7bfeb363acfac482b in term 2.
W20251028 09:10:00.529672  8528 leader_election.cc:336] T bbe94a484b0c4198bb904ffa7d3ef621 P 5cf59e115bd641f8965613ab616c9b85 [CANDIDATE]: Term 2 election: RPC error from VoteRequest() call to peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111)
I20251028 09:10:00.529698  8661 leader_election.cc:304] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b [CANDIDATE]: Term 2 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 5cf59e115bd641f8965613ab616c9b85, a73a6d78104342d7bfeb363acfac482b; no voters: 
W20251028 09:10:00.529840  8659 leader_election.cc:336] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b [CANDIDATE]: Term 2 election: RPC error from VoteRequest() call to peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111)
I20251028 09:10:00.529891  8661 leader_election.cc:304] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b [CANDIDATE]: Term 2 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 5cf59e115bd641f8965613ab616c9b85, a73a6d78104342d7bfeb363acfac482b; no voters: 
W20251028 09:10:00.529948  8659 leader_election.cc:336] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [CANDIDATE]: Term 2 election: RPC error from VoteRequest() call to peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111)
I20251028 09:10:00.529958  8882 raft_consensus.cc:2804] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b [term 2 FOLLOWER]: Leader election won for term 2
I20251028 09:10:00.530035  8904 raft_consensus.cc:2804] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b [term 2 FOLLOWER]: Leader election won for term 2
I20251028 09:10:00.530047  8882 raft_consensus.cc:697] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b [term 2 LEADER]: Becoming Leader. State: Replica: a73a6d78104342d7bfeb363acfac482b, State: Running, Role: LEADER
I20251028 09:10:00.530077  8904 raft_consensus.cc:697] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b [term 2 LEADER]: Becoming Leader. State: Replica: a73a6d78104342d7bfeb363acfac482b, State: Running, Role: LEADER
I20251028 09:10:00.530110  8904 consensus_queue.cc:237] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 167, Committed index: 167, Last appended: 1.169, Last appended by leader: 169, Current term: 2, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } }
I20251028 09:10:00.529832  8721 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "bbe94a484b0c4198bb904ffa7d3ef621" candidate_uuid: "5cf59e115bd641f8965613ab616c9b85" candidate_term: 2 candidate_status { last_received { term: 1 index: 169 } } ignore_live_leader: false dest_uuid: "a73a6d78104342d7bfeb363acfac482b"
I20251028 09:10:00.530234  8721 raft_consensus.cc:2393] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [term 2 FOLLOWER]: Leader election vote request: Denying vote to candidate 5cf59e115bd641f8965613ab616c9b85 in current term 2: Already voted for candidate a73a6d78104342d7bfeb363acfac482b in this term.
I20251028 09:10:00.530098  8882 consensus_queue.cc:237] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 168, Committed index: 168, Last appended: 1.169, Last appended by leader: 169, Current term: 2, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:10:00.529987  8659 leader_election.cc:304] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [CANDIDATE]: Term 2 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: a73a6d78104342d7bfeb363acfac482b; no voters: 3d8ce816521247e18c0d5b9e130edc21, 5cf59e115bd641f8965613ab616c9b85
I20251028 09:10:00.530416  8882 raft_consensus.cc:2749] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [term 2 FOLLOWER]: Leader election lost for term 2. Reason: could not achieve majority
W20251028 09:10:00.530437  8659 leader_election.cc:336] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b [CANDIDATE]: Term 2 election: RPC error from VoteRequest() call to peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111)
I20251028 09:10:00.530635  8527 leader_election.cc:304] T bbe94a484b0c4198bb904ffa7d3ef621 P 5cf59e115bd641f8965613ab616c9b85 [CANDIDATE]: Term 2 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 5cf59e115bd641f8965613ab616c9b85; no voters: 3d8ce816521247e18c0d5b9e130edc21, a73a6d78104342d7bfeb363acfac482b
I20251028 09:10:00.530721  8807 raft_consensus.cc:2749] T bbe94a484b0c4198bb904ffa7d3ef621 P 5cf59e115bd641f8965613ab616c9b85 [term 2 FOLLOWER]: Leader election lost for term 2. Reason: could not achieve majority
I20251028 09:10:00.530784  8313 catalog_manager.cc:5649] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b reported cstate change: term changed from 1 to 2, leader changed from 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129) to a73a6d78104342d7bfeb363acfac482b (127.8.22.131). New cstate: current_term: 2 leader_uuid: "a73a6d78104342d7bfeb363acfac482b" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } health_report { overall_health: HEALTHY } } }
I20251028 09:10:00.530901  8313 catalog_manager.cc:5649] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b reported cstate change: term changed from 1 to 2, leader changed from 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129) to a73a6d78104342d7bfeb363acfac482b (127.8.22.131). New cstate: current_term: 2 leader_uuid: "a73a6d78104342d7bfeb363acfac482b" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } health_report { overall_health: UNKNOWN } } }
W20251028 09:10:00.532655  8683 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.536908  8683 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
I20251028 09:10:00.544368  8591 raft_consensus.cc:1275] T a208b359e1a44c018a8ad71c60ff8fed P 5cf59e115bd641f8965613ab616c9b85 [term 2 FOLLOWER]: Refusing update from remote peer a73a6d78104342d7bfeb363acfac482b: Log matching property violated. Preceding OpId in replica: term: 1 index: 168. Preceding OpId from leader: term: 2 index: 171. (index mismatch)
W20251028 09:10:00.544580  8659 consensus_peers.cc:597] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
I20251028 09:10:00.544632  8793 consensus_queue.cc:1048] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b [LEADER]: Connected to new peer: Peer: permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 170, Last known committed idx: 165, Time since last communication: 0.000s
I20251028 09:10:00.558497  8591 raft_consensus.cc:1275] T 386ef2b749f24de9aac6947b29d3479c P 5cf59e115bd641f8965613ab616c9b85 [term 2 FOLLOWER]: Refusing update from remote peer a73a6d78104342d7bfeb363acfac482b: Log matching property violated. Preceding OpId in replica: term: 1 index: 168. Preceding OpId from leader: term: 2 index: 171. (index mismatch)
I20251028 09:10:00.558743  8882 consensus_queue.cc:1048] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b [LEADER]: Connected to new peer: Peer: permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 170, Last known committed idx: 168, Time since last communication: 0.000s
W20251028 09:10:00.558851  8659 consensus_peers.cc:597] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251028 09:10:00.589577  8552 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.613706  8552 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.617446  8683 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.627765  8552 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.628835  8683 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.643543  8683 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.657361  8552 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.674422  8552 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.700256  8552 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.704123  8683 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.708520  8528 consensus_peers.cc:597] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20251028 09:10:00.719902  8683 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.722283  8528 consensus_peers.cc:597] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20251028 09:10:00.733178  8683 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.738080  8659 consensus_peers.cc:597] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20251028 09:10:00.754132  8552 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.769186  8552 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.801045  8552 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.803764  8683 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.821794  8683 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.836994  8683 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49490: Illegal state: replica a73a6d78104342d7bfeb363acfac482b is not leader of this config: current role FOLLOWER
W20251028 09:10:00.856885  8552 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:00.875219  8552 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
I20251028 09:10:00.876583  8793 raft_consensus.cc:493] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [term 2 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251028 09:10:00.876662  8793 raft_consensus.cc:515] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [term 2 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:10:00.876827  8793 leader_election.cc:290] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [CANDIDATE]: Term 3 pre-election: Requested pre-vote from peers 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547), 5cf59e115bd641f8965613ab616c9b85 (127.8.22.130:46821)
I20251028 09:10:00.877005  8591 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "bbe94a484b0c4198bb904ffa7d3ef621" candidate_uuid: "a73a6d78104342d7bfeb363acfac482b" candidate_term: 3 candidate_status { last_received { term: 1 index: 169 } } ignore_live_leader: false dest_uuid: "5cf59e115bd641f8965613ab616c9b85" is_pre_election: true
I20251028 09:10:00.877089  8591 raft_consensus.cc:2468] T bbe94a484b0c4198bb904ffa7d3ef621 P 5cf59e115bd641f8965613ab616c9b85 [term 2 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate a73a6d78104342d7bfeb363acfac482b in term 2.
I20251028 09:10:00.877246  8661 leader_election.cc:304] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [CANDIDATE]: Term 3 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 5cf59e115bd641f8965613ab616c9b85, a73a6d78104342d7bfeb363acfac482b; no voters: 
W20251028 09:10:00.877323  8659 leader_election.cc:336] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [CANDIDATE]: Term 3 pre-election: RPC error from VoteRequest() call to peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111)
I20251028 09:10:00.877346  8793 raft_consensus.cc:2804] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [term 2 FOLLOWER]: Leader pre-election won for term 3
I20251028 09:10:00.877419  8793 raft_consensus.cc:493] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [term 2 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20251028 09:10:00.877453  8793 raft_consensus.cc:3060] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [term 2 FOLLOWER]: Advancing to term 3
I20251028 09:10:00.878100  8793 raft_consensus.cc:515] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [term 3 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:10:00.878233  8793 leader_election.cc:290] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [CANDIDATE]: Term 3 election: Requested vote from peers 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547), 5cf59e115bd641f8965613ab616c9b85 (127.8.22.130:46821)
I20251028 09:10:00.878466  8591 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "bbe94a484b0c4198bb904ffa7d3ef621" candidate_uuid: "a73a6d78104342d7bfeb363acfac482b" candidate_term: 3 candidate_status { last_received { term: 1 index: 169 } } ignore_live_leader: false dest_uuid: "5cf59e115bd641f8965613ab616c9b85"
I20251028 09:10:00.878532  8591 raft_consensus.cc:3060] T bbe94a484b0c4198bb904ffa7d3ef621 P 5cf59e115bd641f8965613ab616c9b85 [term 2 FOLLOWER]: Advancing to term 3
W20251028 09:10:00.878638  8659 leader_election.cc:336] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [CANDIDATE]: Term 3 election: RPC error from VoteRequest() call to peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111)
I20251028 09:10:00.879266  8591 raft_consensus.cc:2468] T bbe94a484b0c4198bb904ffa7d3ef621 P 5cf59e115bd641f8965613ab616c9b85 [term 3 FOLLOWER]: Leader election vote request: Granting yes vote for candidate a73a6d78104342d7bfeb363acfac482b in term 3.
I20251028 09:10:00.879485  8661 leader_election.cc:304] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [CANDIDATE]: Term 3 election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 5cf59e115bd641f8965613ab616c9b85, a73a6d78104342d7bfeb363acfac482b; no voters: 3d8ce816521247e18c0d5b9e130edc21
I20251028 09:10:00.879587  8793 raft_consensus.cc:2804] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [term 3 FOLLOWER]: Leader election won for term 3
I20251028 09:10:00.879647  8793 raft_consensus.cc:697] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [term 3 LEADER]: Becoming Leader. State: Replica: a73a6d78104342d7bfeb363acfac482b, State: Running, Role: LEADER
I20251028 09:10:00.879696  8793 consensus_queue.cc:237] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 168, Committed index: 168, Last appended: 1.169, Last appended by leader: 169, Current term: 3, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:10:00.880175  8313 catalog_manager.cc:5649] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b reported cstate change: term changed from 1 to 3, leader changed from 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129) to a73a6d78104342d7bfeb363acfac482b (127.8.22.131). New cstate: current_term: 3 leader_uuid: "a73a6d78104342d7bfeb363acfac482b" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } health_report { overall_health: HEALTHY } } }
I20251028 09:10:00.893908  8591 raft_consensus.cc:1275] T bbe94a484b0c4198bb904ffa7d3ef621 P 5cf59e115bd641f8965613ab616c9b85 [term 3 FOLLOWER]: Refusing update from remote peer a73a6d78104342d7bfeb363acfac482b: Log matching property violated. Preceding OpId in replica: term: 1 index: 169. Preceding OpId from leader: term: 3 index: 171. (index mismatch)
I20251028 09:10:00.894169  8882 consensus_queue.cc:1048] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [LEADER]: Connected to new peer: Peer: permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 170, Last known committed idx: 168, Time since last communication: 0.000s
W20251028 09:10:00.894289  8659 consensus_peers.cc:597] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251028 09:10:01.007625  8659 consensus_peers.cc:597] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20251028 09:10:01.026510  8659 consensus_peers.cc:597] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
I20251028 09:10:01.086836  9013 heartbeater.cc:499] Master 127.8.22.190:38971 was elected leader, sending a full tablet report...
W20251028 09:10:01.159435  8528 consensus_peers.cc:597] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20251028 09:10:01.185097  8659 consensus_peers.cc:597] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20251028 09:10:01.190389  8528 consensus_peers.cc:597] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20251028 09:10:01.244952  8780 meta_cache.cc:302] tablet 386ef2b749f24de9aac6947b29d3479c: replica 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547) has failed: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111)
W20251028 09:10:01.394660  8659 consensus_peers.cc:597] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20251028 09:10:01.433557  8548 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:01.435829  8548 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:01.436151  8548 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41694: Illegal state: replica 5cf59e115bd641f8965613ab616c9b85 is not leader of this config: current role FOLLOWER
W20251028 09:10:01.462914  8659 consensus_peers.cc:597] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20251028 09:10:01.534921  8659 consensus_peers.cc:597] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20251028 09:10:01.649892  8528 consensus_peers.cc:597] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20251028 09:10:01.665055  8528 consensus_peers.cc:597] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20251028 09:10:01.717311  8659 consensus_peers.cc:597] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20251028 09:10:01.926782  8659 consensus_peers.cc:597] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20251028 09:10:02.032733  8659 consensus_peers.cc:597] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20251028 09:10:02.040971  8659 consensus_peers.cc:597] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20251028 09:10:02.134626  8528 consensus_peers.cc:597] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
W20251028 09:10:02.142511  8528 consensus_peers.cc:597] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20251028 09:10:02.225420  9049 consensus_queue.cc:579] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 [LEADER]: Leader has been unable to successfully communicate with peer 3d8ce816521247e18c0d5b9e130edc21 for more than 2 seconds (2.003s)
W20251028 09:10:02.225775  8659 consensus_peers.cc:597] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20251028 09:10:02.259583  9049 consensus_queue.cc:579] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 [LEADER]: Leader has been unable to successfully communicate with peer 3d8ce816521247e18c0d5b9e130edc21 for more than 2 seconds (2.038s)
I20251028 09:10:02.330190  9044 consensus_queue.cc:579] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b [LEADER]: Leader has been unable to successfully communicate with peer 3d8ce816521247e18c0d5b9e130edc21 for more than 2 seconds (2.105s)
W20251028 09:10:02.407155  8659 consensus_peers.cc:597] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20251028 09:10:02.527320  8659 consensus_peers.cc:597] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20251028 09:10:02.536255  8904 consensus_queue.cc:579] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b [LEADER]: Leader has been unable to successfully communicate with peer 3d8ce816521247e18c0d5b9e130edc21 for more than 2 seconds (2.006s)
W20251028 09:10:02.542814  8659 consensus_peers.cc:597] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20251028 09:10:02.604779  9043 consensus_queue.cc:579] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b [LEADER]: Leader has been unable to successfully communicate with peer 3d8ce816521247e18c0d5b9e130edc21 for more than 2 seconds (2.075s)
W20251028 09:10:02.646611  8528 consensus_peers.cc:597] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20251028 09:10:02.667119  8528 consensus_peers.cc:597] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20251028 09:10:02.716259  8659 consensus_peers.cc:597] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
I20251028 09:10:02.953127  9028 consensus_queue.cc:579] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [LEADER]: Leader has been unable to successfully communicate with peer 3d8ce816521247e18c0d5b9e130edc21 for more than 2 seconds (2.073s)
W20251028 09:10:02.955550  8659 consensus_peers.cc:597] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
W20251028 09:10:03.016498  8659 consensus_peers.cc:597] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20251028 09:10:03.061383  8659 consensus_peers.cc:597] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20251028 09:10:03.189684  8528 consensus_peers.cc:597] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
W20251028 09:10:03.200179  8659 consensus_peers.cc:597] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
W20251028 09:10:03.218513  8528 consensus_peers.cc:597] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
I20251028 09:10:03.263199  8282 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6m1lU8/build/release/bin/kudu with pid 8290
W20251028 09:10:03.273715  8530 connection.cc:537] client connection to 127.8.22.190:38971 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20251028 09:10:03.273936  8640 heartbeater.cc:646] Failed to heartbeat to 127.8.22.190:38971 (0 consecutive failures): Network error: Failed to send heartbeat to master: recv error from unknown peer: Transport endpoint is not connected (error 107)
I20251028 09:10:03.274597  8282 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6m1lU8/build/release/bin/kudu
/tmp/dist-test-task6m1lU8/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.8.22.190:38971
--webserver_interface=127.8.22.190
--webserver_port=35375
--builtin_ntp_servers=127.8.22.148:34983
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.8.22.190:38971 with env {}
W20251028 09:10:03.424528  9059 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251028 09:10:03.424860  9059 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251028 09:10:03.424950  9059 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251028 09:10:03.427249  9059 flags.cc:425] Enabled experimental flag: --ipki_ca_key_size=768
W20251028 09:10:03.428025  9059 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251028 09:10:03.428115  9059 flags.cc:425] Enabled experimental flag: --tsk_num_rsa_bits=512
W20251028 09:10:03.428184  9059 flags.cc:425] Enabled experimental flag: --rpc_reuseport=true
I20251028 09:10:03.430876  9059 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.8.22.148:34983
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.8.22.190:38971
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.8.22.190:38971
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.8.22.190
--webserver_port=35375
--never_fsync=true
--heap_profile_path=/tmp/kudu.9059
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true
Master server version:
kudu 1.19.0-SNAPSHOT
revision 60f5e5267b92c39485a66121d3ce3cc7ef57b0e0
build type RELEASE
built by None at 28 Oct 2025 08:43:13 UTC on 5fd53c4cbb9d
build id 8755
I20251028 09:10:03.431370  9059 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251028 09:10:03.431708  9059 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251028 09:10:03.434834  9067 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:03.436566  9064 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:03.439309  9065 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:10:03.439898  9059 server_base.cc:1047] running on GCE node
I20251028 09:10:03.440196  9059 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251028 09:10:03.440591  9059 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251028 09:10:03.442202  9059 hybrid_clock.cc:648] HybridClock initialized: now 1761642603441499 us; error 371 us; skew 500 ppm
I20251028 09:10:03.444319  9059 webserver.cc:492] Webserver started at http://127.8.22.190:35375/ using document root <none> and password file <none>
I20251028 09:10:03.444747  9059 fs_manager.cc:362] Metadata directory not provided
I20251028 09:10:03.444902  9059 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251028 09:10:03.447124  9059 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251028 09:10:03.448036  9073 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:03.448243  9059 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20251028 09:10:03.448319  9059 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/master-0/data,/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/master-0/wal
uuid: "f523d2f8a0724bf98f9f9645cb64b032"
format_stamp: "Formatted at 2025-10-28 09:09:59 on dist-test-slave-kqwd"
I20251028 09:10:03.448652  9059 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251028 09:10:03.470741  9059 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251028 09:10:03.471081  9059 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251028 09:10:03.471307  9059 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251028 09:10:03.476197  9059 rpc_server.cc:307] RPC server started. Bound to: 127.8.22.190:38971
I20251028 09:10:03.476668  9059 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/master-0/data/info.pb
I20251028 09:10:03.477586  9125 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.8.22.190:38971 every 8 connection(s)
I20251028 09:10:03.478942  9126 sys_catalog.cc:263] Verifying existing consensus state
I20251028 09:10:03.479797  9126 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032: Bootstrap starting.
I20251028 09:10:03.485710  8282 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6m1lU8/build/release/bin/kudu as pid 9059
I20251028 09:10:03.490794  9126 log.cc:826] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032: Log is configured to *not* fsync() on all Append() calls
I20251028 09:10:03.494380  9126 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032: Bootstrap replayed 1/1 log segments. Stats: ops{read=15 overwritten=0 applied=15 ignored=0} inserts{seen=11 ignored=0} mutations{seen=15 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20251028 09:10:03.494861  9126 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032: Bootstrap complete.
I20251028 09:10:03.498876  9126 raft_consensus.cc:359] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032 [term 1 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "f523d2f8a0724bf98f9f9645cb64b032" member_type: VOTER last_known_addr { host: "127.8.22.190" port: 38971 } }
I20251028 09:10:03.499260  9126 raft_consensus.cc:740] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032 [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: f523d2f8a0724bf98f9f9645cb64b032, State: Initialized, Role: FOLLOWER
I20251028 09:10:03.499426  9126 consensus_queue.cc:260] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 15, Last appended: 1.15, Last appended by leader: 15, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "f523d2f8a0724bf98f9f9645cb64b032" member_type: VOTER last_known_addr { host: "127.8.22.190" port: 38971 } }
I20251028 09:10:03.499500  9126 raft_consensus.cc:399] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032 [term 1 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20251028 09:10:03.499558  9126 raft_consensus.cc:493] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032 [term 1 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20251028 09:10:03.499606  9126 raft_consensus.cc:3060] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032 [term 1 FOLLOWER]: Advancing to term 2
I20251028 09:10:03.500398  9126 raft_consensus.cc:515] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032 [term 2 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "f523d2f8a0724bf98f9f9645cb64b032" member_type: VOTER last_known_addr { host: "127.8.22.190" port: 38971 } }
I20251028 09:10:03.500504  9126 leader_election.cc:304] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032 [CANDIDATE]: Term 2 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: f523d2f8a0724bf98f9f9645cb64b032; no voters: 
I20251028 09:10:03.500701  9126 leader_election.cc:290] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032 [CANDIDATE]: Term 2 election: Requested vote from peers 
I20251028 09:10:03.500753  9129 raft_consensus.cc:2804] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032 [term 2 FOLLOWER]: Leader election won for term 2
I20251028 09:10:03.500916  9129 raft_consensus.cc:697] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032 [term 2 LEADER]: Becoming Leader. State: Replica: f523d2f8a0724bf98f9f9645cb64b032, State: Running, Role: LEADER
I20251028 09:10:03.501019  9126 sys_catalog.cc:565] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032 [sys.catalog]: configured and running, proceeding with master startup.
I20251028 09:10:03.501039  9129 consensus_queue.cc:237] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 15, Committed index: 15, Last appended: 1.15, Last appended by leader: 15, Current term: 2, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "f523d2f8a0724bf98f9f9645cb64b032" member_type: VOTER last_known_addr { host: "127.8.22.190" port: 38971 } }
I20251028 09:10:03.501368  9129 sys_catalog.cc:455] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 2 leader_uuid: "f523d2f8a0724bf98f9f9645cb64b032" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "f523d2f8a0724bf98f9f9645cb64b032" member_type: VOTER last_known_addr { host: "127.8.22.190" port: 38971 } } }
I20251028 09:10:03.501503  9129 sys_catalog.cc:458] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032 [sys.catalog]: This master's current role is: LEADER
I20251028 09:10:03.502897  9129 sys_catalog.cc:455] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032 [sys.catalog]: SysCatalogTable state changed. Reason: New leader f523d2f8a0724bf98f9f9645cb64b032. Latest consensus state: current_term: 2 leader_uuid: "f523d2f8a0724bf98f9f9645cb64b032" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "f523d2f8a0724bf98f9f9645cb64b032" member_type: VOTER last_known_addr { host: "127.8.22.190" port: 38971 } } }
I20251028 09:10:03.503182  9129 sys_catalog.cc:458] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032 [sys.catalog]: This master's current role is: LEADER
I20251028 09:10:03.503616  9143 catalog_manager.cc:1269] Loaded cluster ID: c77fe8589d5c4c16b651a4133596eec9
I20251028 09:10:03.503732  9143 catalog_manager.cc:1562] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032: loading cluster ID for follower catalog manager: success
I20251028 09:10:03.504927  9143 catalog_manager.cc:1584] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032: acquiring CA information for follower catalog manager: success
I20251028 09:10:03.505394  9143 catalog_manager.cc:1612] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032: importing token verification keys for follower catalog manager: success; most recent TSK sequence number 0
I20251028 09:10:03.505514  9144 catalog_manager.cc:1485] Loading table and tablet metadata into memory...
I20251028 09:10:03.505774  9144 catalog_manager.cc:679] Loaded metadata for table test-workload [id=b87c889f1b21497693f06a41b4761586]
I20251028 09:10:03.505957  9144 tablet_loader.cc:96] loaded metadata for tablet 189fdd871f304eb888387f1f15973390 (table test-workload [id=b87c889f1b21497693f06a41b4761586])
I20251028 09:10:03.506031  9144 tablet_loader.cc:96] loaded metadata for tablet 386ef2b749f24de9aac6947b29d3479c (table test-workload [id=b87c889f1b21497693f06a41b4761586])
I20251028 09:10:03.506072  9144 tablet_loader.cc:96] loaded metadata for tablet 61b94d4520cd4af999e8e38b3a475ac4 (table test-workload [id=b87c889f1b21497693f06a41b4761586])
I20251028 09:10:03.506100  9144 tablet_loader.cc:96] loaded metadata for tablet 9e00a8dbf00b4468bbbfc193e444299e (table test-workload [id=b87c889f1b21497693f06a41b4761586])
I20251028 09:10:03.506135  9144 tablet_loader.cc:96] loaded metadata for tablet a208b359e1a44c018a8ad71c60ff8fed (table test-workload [id=b87c889f1b21497693f06a41b4761586])
I20251028 09:10:03.506173  9144 tablet_loader.cc:96] loaded metadata for tablet bbe94a484b0c4198bb904ffa7d3ef621 (table test-workload [id=b87c889f1b21497693f06a41b4761586])
I20251028 09:10:03.506208  9144 catalog_manager.cc:1494] Initializing Kudu cluster ID...
I20251028 09:10:03.506281  9144 catalog_manager.cc:1269] Loaded cluster ID: c77fe8589d5c4c16b651a4133596eec9
I20251028 09:10:03.506307  9144 catalog_manager.cc:1505] Initializing Kudu internal certificate authority...
I20251028 09:10:03.506461  9144 catalog_manager.cc:1514] Loading token signing keys...
I20251028 09:10:03.506517  9144 catalog_manager.cc:6033] T 00000000000000000000000000000000 P f523d2f8a0724bf98f9f9645cb64b032: Loaded TSK: 0
I20251028 09:10:03.506639  9144 catalog_manager.cc:1524] Initializing in-progress tserver states...
W20251028 09:10:03.509558  8659 consensus_peers.cc:597] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20251028 09:10:03.555667  8659 consensus_peers.cc:597] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
W20251028 09:10:03.593307  8659 consensus_peers.cc:597] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
W20251028 09:10:03.671607  8528 consensus_peers.cc:597] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 36: this message will repeat every 5th retry.
W20251028 09:10:03.711225  8528 consensus_peers.cc:597] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 36: this message will repeat every 5th retry.
W20251028 09:10:03.788960  8659 consensus_peers.cc:597] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 36: this message will repeat every 5th retry.
I20251028 09:10:03.964455  9090 master_service.cc:438] Got heartbeat from unknown tserver (permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" instance_seqno: 1761642599577495) as {username='slave'} at 127.8.22.131:52533; Asking this server to re-register.
I20251028 09:10:03.965051  8771 heartbeater.cc:461] Registering TS with master...
I20251028 09:10:03.965129  8771 heartbeater.cc:507] Master 127.8.22.190:38971 requested a full tablet report, sending...
I20251028 09:10:03.966295  9090 ts_manager.cc:194] Registered new tserver with Master: a73a6d78104342d7bfeb363acfac482b (127.8.22.131:38055)
W20251028 09:10:04.028460  8659 consensus_peers.cc:597] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 36: this message will repeat every 5th retry.
W20251028 09:10:04.037648  8659 consensus_peers.cc:597] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
W20251028 09:10:04.062310  8659 consensus_peers.cc:597] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 36: this message will repeat every 5th retry.
I20251028 09:10:04.101183  9090 master_service.cc:438] Got heartbeat from unknown tserver (permanent_uuid: "19ef51fd6f6b47869b63ac0dff2bc4f4" instance_seqno: 1761642599964289) as {username='slave'} at 127.8.22.132:57945; Asking this server to re-register.
I20251028 09:10:04.101511  9013 heartbeater.cc:461] Registering TS with master...
I20251028 09:10:04.101599  9013 heartbeater.cc:507] Master 127.8.22.190:38971 requested a full tablet report, sending...
I20251028 09:10:04.101842  9090 ts_manager.cc:194] Registered new tserver with Master: 19ef51fd6f6b47869b63ac0dff2bc4f4 (127.8.22.132:34091)
W20251028 09:10:04.148478  8528 consensus_peers.cc:597] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 41: this message will repeat every 5th retry.
W20251028 09:10:04.244465  8528 consensus_peers.cc:597] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 41: this message will repeat every 5th retry.
I20251028 09:10:04.281184  8640 heartbeater.cc:344] Connected to a master server at 127.8.22.190:38971
I20251028 09:10:04.281780  9090 master_service.cc:438] Got heartbeat from unknown tserver (permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" instance_seqno: 1761642599454509) as {username='slave'} at 127.8.22.130:37959; Asking this server to re-register.
I20251028 09:10:04.282478  8640 heartbeater.cc:461] Registering TS with master...
I20251028 09:10:04.282562  8640 heartbeater.cc:507] Master 127.8.22.190:38971 requested a full tablet report, sending...
I20251028 09:10:04.283074  9090 ts_manager.cc:194] Registered new tserver with Master: 5cf59e115bd641f8965613ab616c9b85 (127.8.22.130:46821)
W20251028 09:10:04.307207  8659 consensus_peers.cc:597] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 41: this message will repeat every 5th retry.
I20251028 09:10:04.348496  9045 consensus_queue.cc:799] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b [LEADER]: Peer 3d8ce816521247e18c0d5b9e130edc21 is lagging by at least 6 ops behind the committed index 
I20251028 09:10:04.377665  8793 consensus_queue.cc:799] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [LEADER]: Peer 3d8ce816521247e18c0d5b9e130edc21 is lagging by at least 23 ops behind the committed index 
I20251028 09:10:04.393296  9028 consensus_queue.cc:799] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b [LEADER]: Peer 3d8ce816521247e18c0d5b9e130edc21 is lagging by at least 9 ops behind the committed index 
I20251028 09:10:04.393296  9030 consensus_queue.cc:799] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b [LEADER]: Peer 3d8ce816521247e18c0d5b9e130edc21 is lagging by at least 20 ops behind the committed index 
I20251028 09:10:04.434197  9047 consensus_queue.cc:799] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 [LEADER]: Peer 3d8ce816521247e18c0d5b9e130edc21 is lagging by at least 39 ops behind the committed index 
I20251028 09:10:04.459244  9147 consensus_queue.cc:799] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 [LEADER]: Peer 3d8ce816521247e18c0d5b9e130edc21 is lagging by at least 52 ops behind the committed index 
W20251028 09:10:04.510622  8659 consensus_peers.cc:597] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 41: this message will repeat every 5th retry.
W20251028 09:10:04.584313  8659 consensus_peers.cc:597] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 36: this message will repeat every 5th retry.
W20251028 09:10:04.596992  8659 consensus_peers.cc:597] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 41: this message will repeat every 5th retry.
W20251028 09:10:04.661777  8528 consensus_peers.cc:597] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 46: this message will repeat every 5th retry.
W20251028 09:10:04.737661  8528 consensus_peers.cc:597] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 46: this message will repeat every 5th retry.
W20251028 09:10:04.837170  8659 consensus_peers.cc:597] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 46: this message will repeat every 5th retry.
W20251028 09:10:05.035884  8659 consensus_peers.cc:597] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 41: this message will repeat every 5th retry.
W20251028 09:10:05.043573  8659 consensus_peers.cc:597] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 46: this message will repeat every 5th retry.
W20251028 09:10:05.097745  8659 consensus_peers.cc:597] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 46: this message will repeat every 5th retry.
W20251028 09:10:05.208619  8528 consensus_peers.cc:597] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 51: this message will repeat every 5th retry.
W20251028 09:10:05.241725  8659 proxy.cc:239] Call had error, refreshing address and retrying: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111) [suppressed 193 similar messages]
W20251028 09:10:05.282881  8528 proxy.cc:239] Call had error, refreshing address and retrying: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111) [suppressed 104 similar messages]
W20251028 09:10:05.302572  8659 consensus_peers.cc:597] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 51: this message will repeat every 5th retry.
W20251028 09:10:05.319877  8528 consensus_peers.cc:597] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 51: this message will repeat every 5th retry.
W20251028 09:10:05.575770  8659 consensus_peers.cc:597] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 46: this message will repeat every 5th retry.
W20251028 09:10:05.576381  8659 consensus_peers.cc:597] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 51: this message will repeat every 5th retry.
W20251028 09:10:05.576722  8659 consensus_peers.cc:597] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 51: this message will repeat every 5th retry.
W20251028 09:10:05.719383  8528 consensus_peers.cc:597] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 56: this message will repeat every 5th retry.
W20251028 09:10:05.795439  8528 consensus_peers.cc:597] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 56: this message will repeat every 5th retry.
W20251028 09:10:05.841241  8659 consensus_peers.cc:597] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 56: this message will repeat every 5th retry.
W20251028 09:10:06.092904  8659 consensus_peers.cc:597] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 56: this message will repeat every 5th retry.
W20251028 09:10:06.105693  8659 consensus_peers.cc:597] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 51: this message will repeat every 5th retry.
W20251028 09:10:06.154822  8659 consensus_peers.cc:597] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 56: this message will repeat every 5th retry.
W20251028 09:10:06.260154  8528 consensus_peers.cc:597] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 61: this message will repeat every 5th retry.
W20251028 09:10:06.306380  8528 consensus_peers.cc:597] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 61: this message will repeat every 5th retry.
W20251028 09:10:06.392547  8659 consensus_peers.cc:597] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 61: this message will repeat every 5th retry.
I20251028 09:10:06.500749  8282 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6m1lU8/build/release/bin/kudu
/tmp/dist-test-task6m1lU8/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.8.22.129:41547
--local_ip_for_outbound_sockets=127.8.22.129
--tserver_master_addrs=127.8.22.190:38971
--webserver_port=38231
--webserver_interface=127.8.22.129
--builtin_ntp_servers=127.8.22.148:34983
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--follower_unavailable_considered_failed_sec=2
--enable_log_gc=false with env {}
W20251028 09:10:06.617411  8659 consensus_peers.cc:597] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 56: this message will repeat every 5th retry.
W20251028 09:10:06.635262  9170 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251028 09:10:06.635576  9170 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251028 09:10:06.635651  9170 flags.cc:425] Enabled unsafe flag: --enable_log_gc=false
W20251028 09:10:06.635738  9170 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251028 09:10:06.638119  9170 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251028 09:10:06.638262  9170 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.8.22.129
I20251028 09:10:06.640713  9170 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.8.22.148:34983
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--follower_unavailable_considered_failed_sec=2
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.8.22.129:41547
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.8.22.129
--webserver_port=38231
--enable_log_gc=false
--tserver_master_addrs=127.8.22.190:38971
--never_fsync=true
--heap_profile_path=/tmp/kudu.9170
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.8.22.129
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 60f5e5267b92c39485a66121d3ce3cc7ef57b0e0
build type RELEASE
built by None at 28 Oct 2025 08:43:13 UTC on 5fd53c4cbb9d
build id 8755
I20251028 09:10:06.641229  9170 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251028 09:10:06.641525  9170 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251028 09:10:06.644330  9176 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:06.644316  9178 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:06.644311  9175 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:10:06.644842  9170 server_base.cc:1047] running on GCE node
I20251028 09:10:06.645010  9170 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251028 09:10:06.645208  9170 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251028 09:10:06.646339  9170 hybrid_clock.cc:648] HybridClock initialized: now 1761642606646327 us; error 36 us; skew 500 ppm
I20251028 09:10:06.647655  9170 webserver.cc:492] Webserver started at http://127.8.22.129:38231/ using document root <none> and password file <none>
I20251028 09:10:06.647928  9170 fs_manager.cc:362] Metadata directory not provided
I20251028 09:10:06.648032  9170 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
W20251028 09:10:06.649088  8659 consensus_peers.cc:597] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 61: this message will repeat every 5th retry.
I20251028 09:10:06.649662  9170 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.000s	sys 0.001s
I20251028 09:10:06.650892  9184 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:06.653614  9170 fs_manager.cc:730] Time spent opening block manager: real 0.003s	user 0.000s	sys 0.001s
I20251028 09:10:06.653713  9170 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-0/data,/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-0/wal
uuid: "3d8ce816521247e18c0d5b9e130edc21"
format_stamp: "Formatted at 2025-10-28 09:09:59 on dist-test-slave-kqwd"
I20251028 09:10:06.654194  9170 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
W20251028 09:10:06.657596  8659 consensus_peers.cc:597] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 61: this message will repeat every 5th retry.
I20251028 09:10:06.684893  9170 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251028 09:10:06.685545  9170 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251028 09:10:06.685828  9170 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251028 09:10:06.686218  9170 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251028 09:10:06.687166  9191 ts_tablet_manager.cc:542] Loading tablet metadata (0/6 complete)
I20251028 09:10:06.690793  9170 ts_tablet_manager.cc:585] Loaded tablet metadata (6 total tablets, 6 live tablets)
I20251028 09:10:06.690927  9170 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.004s	user 0.000s	sys 0.000s
I20251028 09:10:06.691064  9170 ts_tablet_manager.cc:600] Registering tablets (0/6 complete)
I20251028 09:10:06.693542  9191 tablet_bootstrap.cc:492] T bbe94a484b0c4198bb904ffa7d3ef621 P 3d8ce816521247e18c0d5b9e130edc21: Bootstrap starting.
I20251028 09:10:06.694243  9170 ts_tablet_manager.cc:616] Registered 6 tablets
I20251028 09:10:06.694288  9170 ts_tablet_manager.cc:595] Time spent register tablets: real 0.003s	user 0.002s	sys 0.000s
I20251028 09:10:06.702415  9170 rpc_server.cc:307] RPC server started. Bound to: 127.8.22.129:41547
I20251028 09:10:06.702836  9170 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761642599042423-8282-0/minicluster-data/ts-0/data/info.pb
I20251028 09:10:06.709765  8282 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6m1lU8/build/release/bin/kudu as pid 9170
I20251028 09:10:06.720799  9298 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.8.22.129:41547 every 8 connection(s)
I20251028 09:10:06.723677  9191 log.cc:826] T bbe94a484b0c4198bb904ffa7d3ef621 P 3d8ce816521247e18c0d5b9e130edc21: Log is configured to *not* fsync() on all Append() calls
I20251028 09:10:06.756497  9299 heartbeater.cc:344] Connected to a master server at 127.8.22.190:38971
I20251028 09:10:06.756614  9299 heartbeater.cc:461] Registering TS with master...
I20251028 09:10:06.756875  9299 heartbeater.cc:507] Master 127.8.22.190:38971 requested a full tablet report, sending...
I20251028 09:10:06.757536  9090 ts_manager.cc:194] Registered new tserver with Master: 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547)
I20251028 09:10:06.758231  9090 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.8.22.129:38605
I20251028 09:10:06.772624  9191 tablet_bootstrap.cc:492] T bbe94a484b0c4198bb904ffa7d3ef621 P 3d8ce816521247e18c0d5b9e130edc21: Bootstrap replayed 1/1 log segments. Stats: ops{read=169 overwritten=0 applied=168 ignored=0} inserts{seen=1402 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20251028 09:10:06.773032  9191 tablet_bootstrap.cc:492] T bbe94a484b0c4198bb904ffa7d3ef621 P 3d8ce816521247e18c0d5b9e130edc21: Bootstrap complete.
I20251028 09:10:06.774291  9191 ts_tablet_manager.cc:1403] T bbe94a484b0c4198bb904ffa7d3ef621 P 3d8ce816521247e18c0d5b9e130edc21: Time spent bootstrapping tablet: real 0.081s	user 0.014s	sys 0.019s
I20251028 09:10:06.775403  9191 raft_consensus.cc:359] T bbe94a484b0c4198bb904ffa7d3ef621 P 3d8ce816521247e18c0d5b9e130edc21 [term 1 FOLLOWER]: Replica starting. Triggering 1 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:10:06.777185  9191 raft_consensus.cc:740] T bbe94a484b0c4198bb904ffa7d3ef621 P 3d8ce816521247e18c0d5b9e130edc21 [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: 3d8ce816521247e18c0d5b9e130edc21, State: Initialized, Role: FOLLOWER
I20251028 09:10:06.777336  9191 consensus_queue.cc:260] T bbe94a484b0c4198bb904ffa7d3ef621 P 3d8ce816521247e18c0d5b9e130edc21 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 168, Last appended: 1.169, Last appended by leader: 169, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:10:06.777557  9191 ts_tablet_manager.cc:1434] T bbe94a484b0c4198bb904ffa7d3ef621 P 3d8ce816521247e18c0d5b9e130edc21: Time spent starting tablet: real 0.003s	user 0.005s	sys 0.000s
I20251028 09:10:06.777665  9299 heartbeater.cc:499] Master 127.8.22.190:38971 was elected leader, sending a full tablet report...
I20251028 09:10:06.777966  9191 tablet_bootstrap.cc:492] T 386ef2b749f24de9aac6947b29d3479c P 3d8ce816521247e18c0d5b9e130edc21: Bootstrap starting.
W20251028 09:10:06.807324  8528 consensus_peers.cc:597] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: INITIALIZED. This is attempt 66: this message will repeat every 5th retry.
I20251028 09:10:06.822217  9191 tablet_bootstrap.cc:492] T 386ef2b749f24de9aac6947b29d3479c P 3d8ce816521247e18c0d5b9e130edc21: Bootstrap replayed 1/1 log segments. Stats: ops{read=169 overwritten=0 applied=168 ignored=0} inserts{seen=1371 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20251028 09:10:06.822576  9191 tablet_bootstrap.cc:492] T 386ef2b749f24de9aac6947b29d3479c P 3d8ce816521247e18c0d5b9e130edc21: Bootstrap complete.
I20251028 09:10:06.823881  9191 ts_tablet_manager.cc:1403] T 386ef2b749f24de9aac6947b29d3479c P 3d8ce816521247e18c0d5b9e130edc21: Time spent bootstrapping tablet: real 0.046s	user 0.029s	sys 0.002s
I20251028 09:10:06.824051  9191 raft_consensus.cc:359] T 386ef2b749f24de9aac6947b29d3479c P 3d8ce816521247e18c0d5b9e130edc21 [term 1 FOLLOWER]: Replica starting. Triggering 1 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:10:06.824280  9191 raft_consensus.cc:740] T 386ef2b749f24de9aac6947b29d3479c P 3d8ce816521247e18c0d5b9e130edc21 [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: 3d8ce816521247e18c0d5b9e130edc21, State: Initialized, Role: FOLLOWER
I20251028 09:10:06.824378  9191 consensus_queue.cc:260] T 386ef2b749f24de9aac6947b29d3479c P 3d8ce816521247e18c0d5b9e130edc21 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 168, Last appended: 1.169, Last appended by leader: 169, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:10:06.824994  9191 ts_tablet_manager.cc:1434] T 386ef2b749f24de9aac6947b29d3479c P 3d8ce816521247e18c0d5b9e130edc21: Time spent starting tablet: real 0.001s	user 0.001s	sys 0.000s
I20251028 09:10:06.826416  9191 tablet_bootstrap.cc:492] T 9e00a8dbf00b4468bbbfc193e444299e P 3d8ce816521247e18c0d5b9e130edc21: Bootstrap starting.
I20251028 09:10:06.857095  9191 tablet_bootstrap.cc:492] T 9e00a8dbf00b4468bbbfc193e444299e P 3d8ce816521247e18c0d5b9e130edc21: Bootstrap replayed 1/1 log segments. Stats: ops{read=165 overwritten=0 applied=164 ignored=0} inserts{seen=1378 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20251028 09:10:06.857461  9191 tablet_bootstrap.cc:492] T 9e00a8dbf00b4468bbbfc193e444299e P 3d8ce816521247e18c0d5b9e130edc21: Bootstrap complete.
I20251028 09:10:06.858626  9191 ts_tablet_manager.cc:1403] T 9e00a8dbf00b4468bbbfc193e444299e P 3d8ce816521247e18c0d5b9e130edc21: Time spent bootstrapping tablet: real 0.032s	user 0.013s	sys 0.013s
I20251028 09:10:06.858791  9191 raft_consensus.cc:359] T 9e00a8dbf00b4468bbbfc193e444299e P 3d8ce816521247e18c0d5b9e130edc21 [term 1 FOLLOWER]: Replica starting. Triggering 1 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:10:06.859045  9191 raft_consensus.cc:740] T 9e00a8dbf00b4468bbbfc193e444299e P 3d8ce816521247e18c0d5b9e130edc21 [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: 3d8ce816521247e18c0d5b9e130edc21, State: Initialized, Role: FOLLOWER
I20251028 09:10:06.859109  9191 consensus_queue.cc:260] T 9e00a8dbf00b4468bbbfc193e444299e P 3d8ce816521247e18c0d5b9e130edc21 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 164, Last appended: 1.165, Last appended by leader: 165, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:10:06.859467  9191 ts_tablet_manager.cc:1434] T 9e00a8dbf00b4468bbbfc193e444299e P 3d8ce816521247e18c0d5b9e130edc21: Time spent starting tablet: real 0.001s	user 0.003s	sys 0.000s
I20251028 09:10:06.859539  9191 tablet_bootstrap.cc:492] T 61b94d4520cd4af999e8e38b3a475ac4 P 3d8ce816521247e18c0d5b9e130edc21: Bootstrap starting.
I20251028 09:10:06.867981  9241 raft_consensus.cc:3060] T bbe94a484b0c4198bb904ffa7d3ef621 P 3d8ce816521247e18c0d5b9e130edc21 [term 1 FOLLOWER]: Advancing to term 3
W20251028 09:10:06.888779  8528 consensus_peers.cc:597] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: BOOTSTRAPPING. This is attempt 66: this message will repeat every 5th retry.
I20251028 09:10:06.893630  9239 raft_consensus.cc:3060] T 386ef2b749f24de9aac6947b29d3479c P 3d8ce816521247e18c0d5b9e130edc21 [term 1 FOLLOWER]: Advancing to term 2
W20251028 09:10:06.938174  8659 consensus_peers.cc:597] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: INITIALIZED. This is attempt 66: this message will repeat every 5th retry.
W20251028 09:10:06.972218  9300 tablet_replica_mm_ops.cc:318] Log GC is disabled (check --enable_log_gc)
I20251028 09:10:07.087388  9191 tablet_bootstrap.cc:492] T 61b94d4520cd4af999e8e38b3a475ac4 P 3d8ce816521247e18c0d5b9e130edc21: Bootstrap replayed 1/1 log segments. Stats: ops{read=165 overwritten=0 applied=165 ignored=0} inserts{seen=1420 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20251028 09:10:07.087790  9191 tablet_bootstrap.cc:492] T 61b94d4520cd4af999e8e38b3a475ac4 P 3d8ce816521247e18c0d5b9e130edc21: Bootstrap complete.
I20251028 09:10:07.089104  9191 ts_tablet_manager.cc:1403] T 61b94d4520cd4af999e8e38b3a475ac4 P 3d8ce816521247e18c0d5b9e130edc21: Time spent bootstrapping tablet: real 0.230s	user 0.021s	sys 0.008s
I20251028 09:10:07.089288  9191 raft_consensus.cc:359] T 61b94d4520cd4af999e8e38b3a475ac4 P 3d8ce816521247e18c0d5b9e130edc21 [term 1 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:10:07.089387  9191 raft_consensus.cc:740] T 61b94d4520cd4af999e8e38b3a475ac4 P 3d8ce816521247e18c0d5b9e130edc21 [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: 3d8ce816521247e18c0d5b9e130edc21, State: Initialized, Role: FOLLOWER
I20251028 09:10:07.101433  9191 consensus_queue.cc:260] T 61b94d4520cd4af999e8e38b3a475ac4 P 3d8ce816521247e18c0d5b9e130edc21 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 165, Last appended: 1.165, Last appended by leader: 165, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:10:07.101630  9191 ts_tablet_manager.cc:1434] T 61b94d4520cd4af999e8e38b3a475ac4 P 3d8ce816521247e18c0d5b9e130edc21: Time spent starting tablet: real 0.012s	user 0.002s	sys 0.000s
I20251028 09:10:07.103331  9191 tablet_bootstrap.cc:492] T a208b359e1a44c018a8ad71c60ff8fed P 3d8ce816521247e18c0d5b9e130edc21: Bootstrap starting.
W20251028 09:10:07.197589  8659 consensus_peers.cc:597] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: BOOTSTRAPPING. This is attempt 66: this message will repeat every 5th retry.
I20251028 09:10:07.241349  9315 mvcc.cc:204] Tried to move back new op lower bound from 7215688107309395968 to 7215688091772162048. Current Snapshot: MvccSnapshot[applied={T|T < 7215688102395473920}]
I20251028 09:10:07.312983  9191 tablet_bootstrap.cc:492] T a208b359e1a44c018a8ad71c60ff8fed P 3d8ce816521247e18c0d5b9e130edc21: Bootstrap replayed 1/1 log segments. Stats: ops{read=169 overwritten=0 applied=168 ignored=0} inserts{seen=1393 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20251028 09:10:07.339130  9191 tablet_bootstrap.cc:492] T a208b359e1a44c018a8ad71c60ff8fed P 3d8ce816521247e18c0d5b9e130edc21: Bootstrap complete.
I20251028 09:10:07.340880  9191 ts_tablet_manager.cc:1403] T a208b359e1a44c018a8ad71c60ff8fed P 3d8ce816521247e18c0d5b9e130edc21: Time spent bootstrapping tablet: real 0.238s	user 0.015s	sys 0.015s
I20251028 09:10:07.341043  9191 raft_consensus.cc:359] T a208b359e1a44c018a8ad71c60ff8fed P 3d8ce816521247e18c0d5b9e130edc21 [term 1 FOLLOWER]: Replica starting. Triggering 1 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } }
I20251028 09:10:07.341217  9191 raft_consensus.cc:740] T a208b359e1a44c018a8ad71c60ff8fed P 3d8ce816521247e18c0d5b9e130edc21 [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: 3d8ce816521247e18c0d5b9e130edc21, State: Initialized, Role: FOLLOWER
I20251028 09:10:07.347013  9191 consensus_queue.cc:260] T a208b359e1a44c018a8ad71c60ff8fed P 3d8ce816521247e18c0d5b9e130edc21 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 168, Last appended: 1.169, Last appended by leader: 169, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } }
I20251028 09:10:07.347741  9191 ts_tablet_manager.cc:1434] T a208b359e1a44c018a8ad71c60ff8fed P 3d8ce816521247e18c0d5b9e130edc21: Time spent starting tablet: real 0.007s	user 0.002s	sys 0.000s
I20251028 09:10:07.348240  9191 tablet_bootstrap.cc:492] T 189fdd871f304eb888387f1f15973390 P 3d8ce816521247e18c0d5b9e130edc21: Bootstrap starting.
I20251028 09:10:07.409222  9239 raft_consensus.cc:3060] T a208b359e1a44c018a8ad71c60ff8fed P 3d8ce816521247e18c0d5b9e130edc21 [term 1 FOLLOWER]: Advancing to term 2
I20251028 09:10:07.409703  8793 consensus_queue.cc:799] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b [LEADER]: Peer 3d8ce816521247e18c0d5b9e130edc21 is lagging by at least 1138 ops behind the committed index  [suppressed 28 similar messages]
W20251028 09:10:07.440821  8659 consensus_peers.cc:597] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: BOOTSTRAPPING. This is attempt 71: this message will repeat every 5th retry.
I20251028 09:10:07.497849  9191 tablet_bootstrap.cc:492] T 189fdd871f304eb888387f1f15973390 P 3d8ce816521247e18c0d5b9e130edc21: Bootstrap replayed 1/1 log segments. Stats: ops{read=169 overwritten=0 applied=169 ignored=0} inserts{seen=1327 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20251028 09:10:07.498219  9191 tablet_bootstrap.cc:492] T 189fdd871f304eb888387f1f15973390 P 3d8ce816521247e18c0d5b9e130edc21: Bootstrap complete.
I20251028 09:10:07.499444  9191 ts_tablet_manager.cc:1403] T 189fdd871f304eb888387f1f15973390 P 3d8ce816521247e18c0d5b9e130edc21: Time spent bootstrapping tablet: real 0.151s	user 0.017s	sys 0.014s
I20251028 09:10:07.502249  9191 raft_consensus.cc:359] T 189fdd871f304eb888387f1f15973390 P 3d8ce816521247e18c0d5b9e130edc21 [term 1 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:10:07.502347  9191 raft_consensus.cc:740] T 189fdd871f304eb888387f1f15973390 P 3d8ce816521247e18c0d5b9e130edc21 [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: 3d8ce816521247e18c0d5b9e130edc21, State: Initialized, Role: FOLLOWER
I20251028 09:10:07.503389  9191 consensus_queue.cc:260] T 189fdd871f304eb888387f1f15973390 P 3d8ce816521247e18c0d5b9e130edc21 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 169, Last appended: 1.169, Last appended by leader: 169, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:10:07.503609  9191 ts_tablet_manager.cc:1434] T 189fdd871f304eb888387f1f15973390 P 3d8ce816521247e18c0d5b9e130edc21: Time spent starting tablet: real 0.004s	user 0.001s	sys 0.000s
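The "Bootstrap replayed 1/1 log segments. Stats: ops{...} inserts{...}" line above summarizes WAL replay as a handful of counters. Below is a minimal, self-contained sketch of tallying and printing counters in that shape; the struct, its fields, and the already_flushed flag are hypothetical illustrations, not the internals of tablet_bootstrap.cc.

#include <cstdio>

// Hypothetical replay counters shaped like the ops{...} inserts{...} stats above.
struct ReplayStats {
  int ops_read = 0, ops_overwritten = 0, ops_applied = 0, ops_ignored = 0;
  int inserts_seen = 0, inserts_ignored = 0;
};

// Pretend each replicate op carries some row inserts and a flag saying whether its
// effects were already flushed (so replay can skip it).
struct Op { int num_inserts; bool already_flushed; };

int main() {
  const Op segment[] = {{10, false}, {5, true}, {20, false}};
  ReplayStats s;
  for (const Op& op : segment) {
    s.ops_read++;
    if (op.already_flushed) { s.ops_ignored++; continue; }
    s.ops_applied++;
    s.inserts_seen += op.num_inserts;
  }
  std::printf("ops{read=%d overwritten=%d applied=%d ignored=%d} inserts{seen=%d ignored=%d}\n",
              s.ops_read, s.ops_overwritten, s.ops_applied, s.ops_ignored,
              s.inserts_seen, s.inserts_ignored);
  return 0;
}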
I20251028 09:10:07.793550  9308 raft_consensus.cc:493] T a208b359e1a44c018a8ad71c60ff8fed P 3d8ce816521247e18c0d5b9e130edc21 [term 2 FOLLOWER]: Starting pre-election (detected failure of leader a73a6d78104342d7bfeb363acfac482b)
I20251028 09:10:07.793694  9308 raft_consensus.cc:515] T a208b359e1a44c018a8ad71c60ff8fed P 3d8ce816521247e18c0d5b9e130edc21 [term 2 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } }
I20251028 09:10:07.794047  9308 leader_election.cc:290] T a208b359e1a44c018a8ad71c60ff8fed P 3d8ce816521247e18c0d5b9e130edc21 [CANDIDATE]: Term 3 pre-election: Requested pre-vote from peers a73a6d78104342d7bfeb363acfac482b (127.8.22.131:38055), 5cf59e115bd641f8965613ab616c9b85 (127.8.22.130:46821)
I20251028 09:10:07.796577  9328 raft_consensus.cc:493] T 61b94d4520cd4af999e8e38b3a475ac4 P 3d8ce816521247e18c0d5b9e130edc21 [term 1 FOLLOWER]: Starting pre-election (detected failure of leader 5cf59e115bd641f8965613ab616c9b85)
I20251028 09:10:07.796651  9328 raft_consensus.cc:515] T 61b94d4520cd4af999e8e38b3a475ac4 P 3d8ce816521247e18c0d5b9e130edc21 [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } }
I20251028 09:10:07.796846  9328 leader_election.cc:290] T 61b94d4520cd4af999e8e38b3a475ac4 P 3d8ce816521247e18c0d5b9e130edc21 [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers 5cf59e115bd641f8965613ab616c9b85 (127.8.22.130:46821), a73a6d78104342d7bfeb363acfac482b (127.8.22.131:38055)
I20251028 09:10:07.808301  8724 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "a208b359e1a44c018a8ad71c60ff8fed" candidate_uuid: "3d8ce816521247e18c0d5b9e130edc21" candidate_term: 3 candidate_status { last_received { term: 2 index: 1736 } } ignore_live_leader: false dest_uuid: "a73a6d78104342d7bfeb363acfac482b" is_pre_election: true
I20251028 09:10:07.808475  8721 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "61b94d4520cd4af999e8e38b3a475ac4" candidate_uuid: "3d8ce816521247e18c0d5b9e130edc21" candidate_term: 2 candidate_status { last_received { term: 1 index: 2875 } } ignore_live_leader: false dest_uuid: "a73a6d78104342d7bfeb363acfac482b" is_pre_election: true
I20251028 09:10:07.815013  8594 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "a208b359e1a44c018a8ad71c60ff8fed" candidate_uuid: "3d8ce816521247e18c0d5b9e130edc21" candidate_term: 3 candidate_status { last_received { term: 2 index: 1736 } } ignore_live_leader: false dest_uuid: "5cf59e115bd641f8965613ab616c9b85" is_pre_election: true
I20251028 09:10:07.815215  8593 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "61b94d4520cd4af999e8e38b3a475ac4" candidate_uuid: "3d8ce816521247e18c0d5b9e130edc21" candidate_term: 2 candidate_status { last_received { term: 1 index: 2875 } } ignore_live_leader: false dest_uuid: "5cf59e115bd641f8965613ab616c9b85" is_pre_election: true
I20251028 09:10:07.815964  9188 leader_election.cc:304] T a208b359e1a44c018a8ad71c60ff8fed P 3d8ce816521247e18c0d5b9e130edc21 [CANDIDATE]: Term 3 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 3d8ce816521247e18c0d5b9e130edc21; no voters: 5cf59e115bd641f8965613ab616c9b85, a73a6d78104342d7bfeb363acfac482b
I20251028 09:10:07.816133  9188 leader_election.cc:304] T 61b94d4520cd4af999e8e38b3a475ac4 P 3d8ce816521247e18c0d5b9e130edc21 [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 3d8ce816521247e18c0d5b9e130edc21; no voters: 5cf59e115bd641f8965613ab616c9b85, a73a6d78104342d7bfeb363acfac482b
I20251028 09:10:07.816289  9308 raft_consensus.cc:2749] T 61b94d4520cd4af999e8e38b3a475ac4 P 3d8ce816521247e18c0d5b9e130edc21 [term 1 FOLLOWER]: Leader pre-election lost for term 2. Reason: could not achieve majority
I20251028 09:10:07.999693  9328 raft_consensus.cc:2749] T a208b359e1a44c018a8ad71c60ff8fed P 3d8ce816521247e18c0d5b9e130edc21 [term 2 FOLLOWER]: Leader pre-election lost for term 3. Reason: could not achieve majority
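Both pre-elections above end with "received 3 responses out of 3 voters: 1 yes votes; 2 no votes", i.e. the restarted replica only got its own implicit yes vote and could not reach a majority. A minimal sketch of that decision, assuming a simple strict-majority rule; the names are hypothetical, not leader_election.cc's API.

#include <iostream>
#include <vector>

enum class Vote { kYes, kNo };

// A candidate wins only with strictly more than half of all voters saying yes.
// The candidate's own implicit yes vote is included in the votes vector.
bool WonElection(const std::vector<Vote>& votes, int num_voters) {
  int yes = 0;
  for (Vote v : votes) {
    if (v == Vote::kYes) ++yes;
  }
  return yes > num_voters / 2;
}

int main() {
  // Three-voter config; only the candidate itself voted yes: pre-election lost.
  std::vector<Vote> votes = {Vote::kYes, Vote::kNo, Vote::kNo};
  std::cout << (WonElection(votes, 3) ? "candidate won" : "candidate lost") << "\n";
  return 0;
}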
I20251028 09:10:08.053252  9090 ts_manager.cc:284] Unset tserver state for 3d8ce816521247e18c0d5b9e130edc21 from MAINTENANCE_MODE
I20251028 09:10:08.105126  9013 heartbeater.cc:507] Master 127.8.22.190:38971 requested a full tablet report, sending...
I20251028 09:10:08.389483  8640 heartbeater.cc:507] Master 127.8.22.190:38971 requested a full tablet report, sending...
I20251028 09:10:08.832834  8771 heartbeater.cc:507] Master 127.8.22.190:38971 requested a full tablet report, sending...
I20251028 09:10:09.002218  9299 heartbeater.cc:507] Master 127.8.22.190:38971 requested a full tablet report, sending...
I20251028 09:10:11.123739  9090 ts_manager.cc:295] Set tserver state for 3d8ce816521247e18c0d5b9e130edc21 to MAINTENANCE_MODE
I20251028 09:10:11.124095  8282 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6m1lU8/build/release/bin/kudu with pid 9170
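At this point the test has put tserver 3d8ce816521247e18c0d5b9e130edc21 back into MAINTENANCE_MODE and killed its process, which is exactly the situation the test name describes: a failed tablet server that should not be re-replicated while it is under maintenance. Below is a minimal sketch of that gating decision, under the assumption that the master simply exempts replicas on in-maintenance servers from replacement; the types and names are hypothetical, not Kudu's catalog manager code.

#include <iostream>
#include <string>
#include <unordered_set>

struct TServer {
  std::string uuid;
  bool is_reachable;
};

// Hypothetical policy: replace a replica only if its server is down AND the server
// is not currently exempted by maintenance mode.
bool ShouldRereplicate(const TServer& ts,
                       const std::unordered_set<std::string>& in_maintenance) {
  if (ts.is_reachable) return false;                    // healthy replica, nothing to do
  if (in_maintenance.count(ts.uuid) > 0) return false;  // failed but under maintenance: hold off
  return true;                                          // failed and not exempt: re-replicate
}

int main() {
  std::unordered_set<std::string> in_maintenance = {"3d8ce816521247e18c0d5b9e130edc21"};
  TServer dead_ts{"3d8ce816521247e18c0d5b9e130edc21", /*is_reachable=*/false};

  std::cout << ShouldRereplicate(dead_ts, in_maintenance) << "\n";  // 0: held back
  in_maintenance.clear();  // maintenance mode exited
  std::cout << ShouldRereplicate(dead_ts, in_maintenance) << "\n";  // 1: replace it now
  return 0;
}

This also matches what follows in the log: the ADD_PEER:NON_VOTER config changes further below only begin after the maintenance state is unset again.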
W20251028 09:10:11.149998  8528 proxy.cc:239] Call had error, refreshing address and retrying: Network error: recv got EOF from 127.8.22.129:41547 (error 108) [suppressed 27 similar messages]
W20251028 09:10:11.150251  8659 connection.cc:537] client connection to 127.8.22.129:41547 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20251028 09:10:11.150313  8659 proxy.cc:239] Call had error, refreshing address and retrying: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107) [suppressed 54 similar messages]
W20251028 09:10:11.153431  8528 consensus_peers.cc:597] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251028 09:10:11.153501  8528 consensus_peers.cc:597] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251028 09:10:11.153898  8659 consensus_peers.cc:597] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251028 09:10:11.153968  8659 consensus_peers.cc:597] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251028 09:10:11.153992  8659 consensus_peers.cc:597] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251028 09:10:11.154018  8659 consensus_peers.cc:597] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251028 09:10:11.597508  8528 consensus_peers.cc:597] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20251028 09:10:11.597828  8528 consensus_peers.cc:597] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20251028 09:10:11.633030  8659 consensus_peers.cc:597] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20251028 09:10:11.635515  8659 consensus_peers.cc:597] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20251028 09:10:11.654934  8659 consensus_peers.cc:597] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20251028 09:10:11.694622  8659 consensus_peers.cc:597] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20251028 09:10:12.107369  8528 consensus_peers.cc:597] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20251028 09:10:12.119580  8528 consensus_peers.cc:597] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20251028 09:10:12.125223  8659 consensus_peers.cc:597] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20251028 09:10:12.159176  8659 consensus_peers.cc:597] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20251028 09:10:12.203583  8659 consensus_peers.cc:597] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20251028 09:10:12.203672  8659 consensus_peers.cc:597] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20251028 09:10:12.589238  8659 consensus_peers.cc:597] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20251028 09:10:12.646762  8528 consensus_peers.cc:597] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20251028 09:10:12.657876  8659 consensus_peers.cc:597] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20251028 09:10:12.679936  8659 consensus_peers.cc:597] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20251028 09:10:12.708477  8528 consensus_peers.cc:597] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20251028 09:10:12.759209  8659 consensus_peers.cc:597] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
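Each consensus_peers.cc warning above carries "This is attempt N: this message will repeat every 5th retry", and the attempts that actually appear are 1, 6, 11, 16, ..., so a persistently unreachable peer produces one warning per five retries rather than one per retry. A minimal sketch of that suppression cadence, assuming a simple modulo rule; this illustrates the cadence only and is not Kudu's log-throttling code.

#include <iostream>

// Log attempt 1 and then every 5th attempt after it (6, 11, 16, ...).
bool ShouldLogAttempt(int attempt) {
  return (attempt - 1) % 5 == 0;
}

int main() {
  for (int attempt = 1; attempt <= 17; ++attempt) {
    if (ShouldLogAttempt(attempt)) {
      std::cout << "Couldn't send request to peer. This is attempt " << attempt << "\n";
    }
  }
  return 0;
}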
I20251028 09:10:13.128317  9357 consensus_queue.cc:579] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b [LEADER]: Leader has been unable to successfully communicate with peer 3d8ce816521247e18c0d5b9e130edc21 for more than 2 seconds (2.007s)
W20251028 09:10:13.134335  8659 consensus_peers.cc:597] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20251028 09:10:13.151154  9385 consensus_queue.cc:579] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 [LEADER]: Leader has been unable to successfully communicate with peer 3d8ce816521247e18c0d5b9e130edc21 for more than 2 seconds (2.032s)
I20251028 09:10:13.151242  9326 consensus_queue.cc:579] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 [LEADER]: Leader has been unable to successfully communicate with peer 3d8ce816521247e18c0d5b9e130edc21 for more than 2 seconds (2.030s)
W20251028 09:10:13.154769  8528 consensus_peers.cc:597] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20251028 09:10:13.164806  9355 consensus_queue.cc:579] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b [LEADER]: Leader has been unable to successfully communicate with peer 3d8ce816521247e18c0d5b9e130edc21 for more than 2 seconds (2.042s)
I20251028 09:10:13.171820  9357 consensus_queue.cc:579] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b [LEADER]: Leader has been unable to successfully communicate with peer 3d8ce816521247e18c0d5b9e130edc21 for more than 2 seconds (2.048s)
W20251028 09:10:13.176690  8659 consensus_peers.cc:597] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20251028 09:10:13.184782  9354 consensus_queue.cc:579] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [LEADER]: Leader has been unable to successfully communicate with peer 3d8ce816521247e18c0d5b9e130edc21 for more than 2 seconds (2.063s)
W20251028 09:10:13.189877  8659 consensus_peers.cc:597] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
W20251028 09:10:13.235159  8528 consensus_peers.cc:597] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
W20251028 09:10:13.271560  8659 consensus_peers.cc:597] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
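Once the process is gone, each leader starts reporting "Leader has been unable to successfully communicate with peer ... for more than 2 seconds", i.e. it keeps the time of the last successful exchange with each peer and flags peers that stay silent past a threshold. A minimal sketch of that bookkeeping with std::chrono; the class and method names are hypothetical, and the 20 ms threshold below is just the log's 2 s scaled down so the example runs quickly.

#include <chrono>
#include <iostream>
#include <thread>

class PeerTracker {
 public:
  using Clock = std::chrono::steady_clock;

  // Called whenever a request/response round trip with the peer succeeds.
  void RecordSuccessfulExchange() { last_ok_ = Clock::now(); }

  // True once the peer has been silent for longer than the given threshold.
  bool Unreachable(Clock::duration threshold) const {
    return Clock::now() - last_ok_ > threshold;
  }

 private:
  Clock::time_point last_ok_ = Clock::now();
};

int main() {
  PeerTracker peer;
  peer.RecordSuccessfulExchange();
  std::this_thread::sleep_for(std::chrono::milliseconds(30));
  std::cout << peer.Unreachable(std::chrono::milliseconds(20)) << "\n";  // prints 1: silent too long
  return 0;
}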
I20251028 09:10:13.371898  9090 ts_manager.cc:284] Unset tserver state for 3d8ce816521247e18c0d5b9e130edc21 from MAINTENANCE_MODE
W20251028 09:10:13.635103  8659 consensus_peers.cc:597] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20251028 09:10:13.682818  8528 consensus_peers.cc:597] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20251028 09:10:13.702473  8659 consensus_peers.cc:597] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20251028 09:10:13.709390  8659 consensus_peers.cc:597] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20251028 09:10:13.773864  8528 consensus_peers.cc:597] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20251028 09:10:13.778640  8659 consensus_peers.cc:597] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
I20251028 09:10:14.109982  9013 heartbeater.cc:507] Master 127.8.22.190:38971 requested a full tablet report, sending...
W20251028 09:10:14.129335  8659 consensus_peers.cc:597] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
I20251028 09:10:14.156198  8640 heartbeater.cc:507] Master 127.8.22.190:38971 requested a full tablet report, sending...
I20251028 09:10:14.170188  8591 logging.cc:424] LogThrottler TrackedPeer Lag: suppressed but not reported on 25 messages since previous log ~9 seconds ago
I20251028 09:10:14.170188  8593 logging.cc:424] LogThrottler TrackedPeer Lag: suppressed but not reported on 23 messages since previous log ~9 seconds ago
I20251028 09:10:14.170298  8591 consensus_queue.cc:237] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 5527, Committed index: 5527, Last appended: 1.5527, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 5528 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } } peers { permanent_uuid: "19ef51fd6f6b47869b63ac0dff2bc4f4" member_type: NON_VOTER last_known_addr { host: "127.8.22.132" port: 34091 } attrs { promote: true } }
I20251028 09:10:14.170425  8593 consensus_queue.cc:237] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 5528, Committed index: 5528, Last appended: 1.5528, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 5529 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } } peers { permanent_uuid: "19ef51fd6f6b47869b63ac0dff2bc4f4" member_type: NON_VOTER last_known_addr { host: "127.8.22.132" port: 34091 } attrs { promote: true } }
I20251028 09:10:14.170804  8724 raft_consensus.cc:1275] T 61b94d4520cd4af999e8e38b3a475ac4 P a73a6d78104342d7bfeb363acfac482b [term 1 FOLLOWER]: Refusing update from remote peer 5cf59e115bd641f8965613ab616c9b85: Log matching property violated. Preceding OpId in replica: term: 1 index: 5527. Preceding OpId from leader: term: 1 index: 5528. (index mismatch)
I20251028 09:10:14.170814  8725 raft_consensus.cc:1275] T 9e00a8dbf00b4468bbbfc193e444299e P a73a6d78104342d7bfeb363acfac482b [term 1 FOLLOWER]: Refusing update from remote peer 5cf59e115bd641f8965613ab616c9b85: Log matching property violated. Preceding OpId in replica: term: 1 index: 5528. Preceding OpId from leader: term: 1 index: 5529. (index mismatch)
I20251028 09:10:14.171034  9385 consensus_queue.cc:1048] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 [LEADER]: Connected to new peer: Peer: permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 5528, Last known committed idx: 5527, Time since last communication: 0.000s
I20251028 09:10:14.171075  9358 consensus_queue.cc:1048] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 [LEADER]: Connected to new peer: Peer: permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 5529, Last known committed idx: 5528, Time since last communication: 0.000s
I20251028 09:10:14.171763  9358 raft_consensus.cc:2955] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 [term 1 LEADER]: Committing config change with OpId 1.5528: config changed from index -1 to 5528, NON_VOTER 19ef51fd6f6b47869b63ac0dff2bc4f4 (127.8.22.132) added. New config: { opid_index: 5528 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } } peers { permanent_uuid: "19ef51fd6f6b47869b63ac0dff2bc4f4" member_type: NON_VOTER last_known_addr { host: "127.8.22.132" port: 34091 } attrs { promote: true } } }
I20251028 09:10:14.171772  9384 raft_consensus.cc:2955] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 [term 1 LEADER]: Committing config change with OpId 1.5529: config changed from index -1 to 5529, NON_VOTER 19ef51fd6f6b47869b63ac0dff2bc4f4 (127.8.22.132) added. New config: { opid_index: 5529 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } } peers { permanent_uuid: "19ef51fd6f6b47869b63ac0dff2bc4f4" member_type: NON_VOTER last_known_addr { host: "127.8.22.132" port: 34091 } attrs { promote: true } } }
W20251028 09:10:14.171916  8528 consensus_peers.cc:597] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251028 09:10:14.171962  8528 consensus_peers.cc:597] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
I20251028 09:10:14.171926  8724 raft_consensus.cc:2955] T 9e00a8dbf00b4468bbbfc193e444299e P a73a6d78104342d7bfeb363acfac482b [term 1 FOLLOWER]: Committing config change with OpId 1.5529: config changed from index -1 to 5529, NON_VOTER 19ef51fd6f6b47869b63ac0dff2bc4f4 (127.8.22.132) added. New config: { opid_index: 5529 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } } peers { permanent_uuid: "19ef51fd6f6b47869b63ac0dff2bc4f4" member_type: NON_VOTER last_known_addr { host: "127.8.22.132" port: 34091 } attrs { promote: true } } }
I20251028 09:10:14.173032  9077 catalog_manager.cc:5162] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet 9e00a8dbf00b4468bbbfc193e444299e with cas_config_opid_index -1: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
I20251028 09:10:14.171936  8725 raft_consensus.cc:2955] T 61b94d4520cd4af999e8e38b3a475ac4 P a73a6d78104342d7bfeb363acfac482b [term 1 FOLLOWER]: Committing config change with OpId 1.5528: config changed from index -1 to 5528, NON_VOTER 19ef51fd6f6b47869b63ac0dff2bc4f4 (127.8.22.132) added. New config: { opid_index: 5528 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } } peers { permanent_uuid: "19ef51fd6f6b47869b63ac0dff2bc4f4" member_type: NON_VOTER last_known_addr { host: "127.8.22.132" port: 34091 } attrs { promote: true } } }
I20251028 09:10:14.173350  9077 catalog_manager.cc:5162] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet 61b94d4520cd4af999e8e38b3a475ac4 with cas_config_opid_index -1: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
I20251028 09:10:14.173319  9090 catalog_manager.cc:5649] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 reported cstate change: config changed from index -1 to 5529, NON_VOTER 19ef51fd6f6b47869b63ac0dff2bc4f4 (127.8.22.132) added. New cstate: current_term: 1 leader_uuid: "5cf59e115bd641f8965613ab616c9b85" committed_config { opid_index: 5529 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "19ef51fd6f6b47869b63ac0dff2bc4f4" member_type: NON_VOTER last_known_addr { host: "127.8.22.132" port: 34091 } attrs { promote: true } health_report { overall_health: UNKNOWN } } }
I20251028 09:10:14.180160  8771 heartbeater.cc:507] Master 127.8.22.190:38971 requested a full tablet report, sending...
I20251028 09:10:14.180089  9089 catalog_manager.cc:5649] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 reported cstate change: config changed from index -1 to 5528, NON_VOTER 19ef51fd6f6b47869b63ac0dff2bc4f4 (127.8.22.132) added. New cstate: current_term: 1 leader_uuid: "5cf59e115bd641f8965613ab616c9b85" committed_config { opid_index: 5528 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "19ef51fd6f6b47869b63ac0dff2bc4f4" member_type: NON_VOTER last_known_addr { host: "127.8.22.132" port: 34091 } attrs { promote: true } health_report { overall_health: UNKNOWN } } }
W20251028 09:10:14.183086  8528 consensus_peers.cc:597] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85 -> Peer 19ef51fd6f6b47869b63ac0dff2bc4f4 (127.8.22.132:34091): Couldn't send request to peer 19ef51fd6f6b47869b63ac0dff2bc4f4. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: 9e00a8dbf00b4468bbbfc193e444299e. This is attempt 1: this message will repeat every 5th retry.
W20251028 09:10:14.183174  8528 consensus_peers.cc:597] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85 -> Peer 19ef51fd6f6b47869b63ac0dff2bc4f4 (127.8.22.132:34091): Couldn't send request to peer 19ef51fd6f6b47869b63ac0dff2bc4f4. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: 61b94d4520cd4af999e8e38b3a475ac4. This is attempt 1: this message will repeat every 5th retry.
I20251028 09:10:14.185560  8723 logging.cc:424] LogThrottler TrackedPeer Lag: suppressed but not reported on 24 messages since previous log ~9 seconds ago
I20251028 09:10:14.185496  8725 logging.cc:424] LogThrottler TrackedPeer Lag: suppressed but not reported on 1 messages since previous log ~6 seconds ago
I20251028 09:10:14.185497  8721 logging.cc:424] LogThrottler TrackedPeer Lag: suppressed but not reported on 24 messages since previous log ~9 seconds ago
I20251028 09:10:14.185680  8721 consensus_queue.cc:237] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 5529, Committed index: 5529, Last appended: 2.5533, Last appended by leader: 169, Current term: 2, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 5534 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } } peers { permanent_uuid: "19ef51fd6f6b47869b63ac0dff2bc4f4" member_type: NON_VOTER last_known_addr { host: "127.8.22.132" port: 34091 } attrs { promote: true } }
I20251028 09:10:14.185787  8725 consensus_queue.cc:237] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 5526, Committed index: 5526, Last appended: 1.5530, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 5531 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } } peers { permanent_uuid: "19ef51fd6f6b47869b63ac0dff2bc4f4" member_type: NON_VOTER last_known_addr { host: "127.8.22.132" port: 34091 } attrs { promote: true } }
I20251028 09:10:14.185498  8724 logging.cc:424] LogThrottler TrackedPeer Lag: suppressed but not reported on 28 messages since previous log ~9 seconds ago
I20251028 09:10:14.185787  8723 consensus_queue.cc:237] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 5528, Committed index: 5528, Last appended: 3.5532, Last appended by leader: 169, Current term: 3, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 5533 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } } peers { permanent_uuid: "19ef51fd6f6b47869b63ac0dff2bc4f4" member_type: NON_VOTER last_known_addr { host: "127.8.22.132" port: 34091 } attrs { promote: true } }
I20251028 09:10:14.186105  8724 consensus_queue.cc:237] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 5529, Committed index: 5529, Last appended: 2.5533, Last appended by leader: 169, Current term: 2, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 5534 OBSOLETE_local: false peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "19ef51fd6f6b47869b63ac0dff2bc4f4" member_type: NON_VOTER last_known_addr { host: "127.8.22.132" port: 34091 } attrs { promote: true } }
I20251028 09:10:14.187161  8593 raft_consensus.cc:1275] T 386ef2b749f24de9aac6947b29d3479c P 5cf59e115bd641f8965613ab616c9b85 [term 2 FOLLOWER]: Refusing update from remote peer a73a6d78104342d7bfeb363acfac482b: Log matching property violated. Preceding OpId in replica: term: 2 index: 5530. Preceding OpId from leader: term: 2 index: 5534. (index mismatch)
I20251028 09:10:14.187372  8593 raft_consensus.cc:1275] T 189fdd871f304eb888387f1f15973390 P 5cf59e115bd641f8965613ab616c9b85 [term 1 FOLLOWER]: Refusing update from remote peer a73a6d78104342d7bfeb363acfac482b: Log matching property violated. Preceding OpId in replica: term: 1 index: 5527. Preceding OpId from leader: term: 1 index: 5531. (index mismatch)
I20251028 09:10:14.187541  8593 raft_consensus.cc:1275] T a208b359e1a44c018a8ad71c60ff8fed P 5cf59e115bd641f8965613ab616c9b85 [term 2 FOLLOWER]: Refusing update from remote peer a73a6d78104342d7bfeb363acfac482b: Log matching property violated. Preceding OpId in replica: term: 2 index: 5531. Preceding OpId from leader: term: 2 index: 5534. (index mismatch)
I20251028 09:10:14.187672  8593 raft_consensus.cc:1275] T bbe94a484b0c4198bb904ffa7d3ef621 P 5cf59e115bd641f8965613ab616c9b85 [term 3 FOLLOWER]: Refusing update from remote peer a73a6d78104342d7bfeb363acfac482b: Log matching property violated. Preceding OpId in replica: term: 3 index: 5528. Preceding OpId from leader: term: 3 index: 5533. (index mismatch)
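The four "Refusing update ... Log matching property violated" lines above are the standard Raft log-matching check: a follower accepts an append only if the leader's stated preceding OpId (term and index) matches the entry the follower already has at that index; otherwise it reports a mismatch and the leader backs its next index up, as the "Connected to new peer ... Status: LMP_MISMATCH" lines that follow show. A minimal sketch of the check itself; the names are hypothetical, not raft_consensus.cc's API.

#include <iostream>
#include <vector>

struct OpId { int term; long index; };

// term_at[i] holds the term of the follower's entry at 1-based index i+1.
bool PrecedingEntryMatches(const std::vector<int>& term_at, OpId preceding) {
  if (preceding.index == 0) return true;                                  // appending from the start
  if (preceding.index > static_cast<long>(term_at.size())) return false;  // follower has a gap: reject
  return term_at[preceding.index - 1] == preceding.term;                  // terms at that index must agree
}

int main() {
  std::vector<int> follower_terms(5527, 1);  // follower has entries 1..5527, all from term 1
  // Leader says the entry preceding its batch is (term 1, index 5528); the follower
  // only has up to 5527, so the append is refused and the leader retries lower.
  std::cout << PrecedingEntryMatches(follower_terms, OpId{1, 5528}) << "\n";  // 0
  std::cout << PrecedingEntryMatches(follower_terms, OpId{1, 5527}) << "\n";  // 1
  return 0;
}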
I20251028 09:10:14.187958  9352 consensus_queue.cc:1048] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b [LEADER]: Connected to new peer: Peer: permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 5534, Last known committed idx: 5529, Time since last communication: 0.000s
I20251028 09:10:14.187928  9378 consensus_queue.cc:1048] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b [LEADER]: Connected to new peer: Peer: permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 5534, Last known committed idx: 5529, Time since last communication: 0.000s
I20251028 09:10:14.187928  9376 consensus_queue.cc:1048] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b [LEADER]: Connected to new peer: Peer: permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 5531, Last known committed idx: 5526, Time since last communication: 0.000s
I20251028 09:10:14.188038  9357 consensus_queue.cc:1048] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [LEADER]: Connected to new peer: Peer: permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 5533, Last known committed idx: 5528, Time since last communication: 0.000s
W20251028 09:10:14.188443  8659 consensus_peers.cc:597] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251028 09:10:14.188498  8659 consensus_peers.cc:597] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251028 09:10:14.188520  8659 consensus_peers.cc:597] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251028 09:10:14.188541  8659 consensus_peers.cc:597] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b -> Peer 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547): Couldn't send request to peer 3d8ce816521247e18c0d5b9e130edc21. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
I20251028 09:10:14.189520  9376 raft_consensus.cc:2955] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b [term 1 LEADER]: Committing config change with OpId 1.5531: config changed from index -1 to 5531, NON_VOTER 19ef51fd6f6b47869b63ac0dff2bc4f4 (127.8.22.132) added. New config: { opid_index: 5531 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } } peers { permanent_uuid: "19ef51fd6f6b47869b63ac0dff2bc4f4" member_type: NON_VOTER last_known_addr { host: "127.8.22.132" port: 34091 } attrs { promote: true } } }
I20251028 09:10:14.189570  9352 raft_consensus.cc:2955] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b [term 2 LEADER]: Committing config change with OpId 2.5534: config changed from index -1 to 5534, NON_VOTER 19ef51fd6f6b47869b63ac0dff2bc4f4 (127.8.22.132) added. New config: { opid_index: 5534 OBSOLETE_local: false peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "19ef51fd6f6b47869b63ac0dff2bc4f4" member_type: NON_VOTER last_known_addr { host: "127.8.22.132" port: 34091 } attrs { promote: true } } }
I20251028 09:10:14.189626  8593 raft_consensus.cc:2955] T 189fdd871f304eb888387f1f15973390 P 5cf59e115bd641f8965613ab616c9b85 [term 1 FOLLOWER]: Committing config change with OpId 1.5531: config changed from index -1 to 5531, NON_VOTER 19ef51fd6f6b47869b63ac0dff2bc4f4 (127.8.22.132) added. New config: { opid_index: 5531 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } } peers { permanent_uuid: "19ef51fd6f6b47869b63ac0dff2bc4f4" member_type: NON_VOTER last_known_addr { host: "127.8.22.132" port: 34091 } attrs { promote: true } } }
I20251028 09:10:14.190599  8593 raft_consensus.cc:2955] T a208b359e1a44c018a8ad71c60ff8fed P 5cf59e115bd641f8965613ab616c9b85 [term 2 FOLLOWER]: Committing config change with OpId 2.5534: config changed from index -1 to 5534, NON_VOTER 19ef51fd6f6b47869b63ac0dff2bc4f4 (127.8.22.132) added. New config: { opid_index: 5534 OBSOLETE_local: false peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "19ef51fd6f6b47869b63ac0dff2bc4f4" member_type: NON_VOTER last_known_addr { host: "127.8.22.132" port: 34091 } attrs { promote: true } } }
I20251028 09:10:14.190922  9074 catalog_manager.cc:5162] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet a208b359e1a44c018a8ad71c60ff8fed with cas_config_opid_index -1: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
I20251028 09:10:14.191366  9074 catalog_manager.cc:5162] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet 189fdd871f304eb888387f1f15973390 with cas_config_opid_index -1: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
I20251028 09:10:14.191305  9090 catalog_manager.cc:5649] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b reported cstate change: config changed from index -1 to 5534, NON_VOTER 19ef51fd6f6b47869b63ac0dff2bc4f4 (127.8.22.132) added. New cstate: current_term: 2 leader_uuid: "a73a6d78104342d7bfeb363acfac482b" committed_config { opid_index: 5534 OBSOLETE_local: false peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "19ef51fd6f6b47869b63ac0dff2bc4f4" member_type: NON_VOTER last_known_addr { host: "127.8.22.132" port: 34091 } attrs { promote: true } health_report { overall_health: UNKNOWN } } }
I20251028 09:10:14.191447  9090 catalog_manager.cc:5649] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b reported cstate change: config changed from index -1 to 5531, NON_VOTER 19ef51fd6f6b47869b63ac0dff2bc4f4 (127.8.22.132) added. New cstate: current_term: 1 leader_uuid: "a73a6d78104342d7bfeb363acfac482b" committed_config { opid_index: 5531 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "19ef51fd6f6b47869b63ac0dff2bc4f4" member_type: NON_VOTER last_known_addr { host: "127.8.22.132" port: 34091 } attrs { promote: true } health_report { overall_health: UNKNOWN } } }
W20251028 09:10:14.191680  8659 consensus_peers.cc:597] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b -> Peer 19ef51fd6f6b47869b63ac0dff2bc4f4 (127.8.22.132:34091): Couldn't send request to peer 19ef51fd6f6b47869b63ac0dff2bc4f4. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: 189fdd871f304eb888387f1f15973390. This is attempt 1: this message will repeat every 5th retry.
W20251028 09:10:14.191742  8659 consensus_peers.cc:597] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b -> Peer 19ef51fd6f6b47869b63ac0dff2bc4f4 (127.8.22.132:34091): Couldn't send request to peer 19ef51fd6f6b47869b63ac0dff2bc4f4. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: a208b359e1a44c018a8ad71c60ff8fed. This is attempt 1: this message will repeat every 5th retry.
W20251028 09:10:14.191779  8659 consensus_peers.cc:597] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b -> Peer 19ef51fd6f6b47869b63ac0dff2bc4f4 (127.8.22.132:34091): Couldn't send request to peer 19ef51fd6f6b47869b63ac0dff2bc4f4. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: bbe94a484b0c4198bb904ffa7d3ef621. This is attempt 1: this message will repeat every 5th retry.
W20251028 09:10:14.191807  8659 consensus_peers.cc:597] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b -> Peer 19ef51fd6f6b47869b63ac0dff2bc4f4 (127.8.22.132:34091): Couldn't send request to peer 19ef51fd6f6b47869b63ac0dff2bc4f4. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: 386ef2b749f24de9aac6947b29d3479c. This is attempt 1: this message will repeat every 5th retry.
I20251028 09:10:14.192097  9352 raft_consensus.cc:2955] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b [term 2 LEADER]: Committing config change with OpId 2.5534: config changed from index -1 to 5534, NON_VOTER 19ef51fd6f6b47869b63ac0dff2bc4f4 (127.8.22.132) added. New config: { opid_index: 5534 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } } peers { permanent_uuid: "19ef51fd6f6b47869b63ac0dff2bc4f4" member_type: NON_VOTER last_known_addr { host: "127.8.22.132" port: 34091 } attrs { promote: true } } }
I20251028 09:10:14.192265  9354 raft_consensus.cc:2955] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b [term 3 LEADER]: Committing config change with OpId 3.5533: config changed from index -1 to 5533, NON_VOTER 19ef51fd6f6b47869b63ac0dff2bc4f4 (127.8.22.132) added. New config: { opid_index: 5533 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } } peers { permanent_uuid: "19ef51fd6f6b47869b63ac0dff2bc4f4" member_type: NON_VOTER last_known_addr { host: "127.8.22.132" port: 34091 } attrs { promote: true } } }
I20251028 09:10:14.193586  8593 raft_consensus.cc:2955] T 386ef2b749f24de9aac6947b29d3479c P 5cf59e115bd641f8965613ab616c9b85 [term 2 FOLLOWER]: Committing config change with OpId 2.5534: config changed from index -1 to 5534, NON_VOTER 19ef51fd6f6b47869b63ac0dff2bc4f4 (127.8.22.132) added. New config: { opid_index: 5534 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } } peers { permanent_uuid: "19ef51fd6f6b47869b63ac0dff2bc4f4" member_type: NON_VOTER last_known_addr { host: "127.8.22.132" port: 34091 } attrs { promote: true } } }
I20251028 09:10:14.196270  9074 catalog_manager.cc:5162] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet 386ef2b749f24de9aac6947b29d3479c with cas_config_opid_index -1: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
I20251028 09:10:14.196385  9074 catalog_manager.cc:5162] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet bbe94a484b0c4198bb904ffa7d3ef621 with cas_config_opid_index -1: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
I20251028 09:10:14.197117  8592 raft_consensus.cc:2955] T bbe94a484b0c4198bb904ffa7d3ef621 P 5cf59e115bd641f8965613ab616c9b85 [term 3 FOLLOWER]: Committing config change with OpId 3.5533: config changed from index -1 to 5533, NON_VOTER 19ef51fd6f6b47869b63ac0dff2bc4f4 (127.8.22.132) added. New config: { opid_index: 5533 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } } peers { permanent_uuid: "19ef51fd6f6b47869b63ac0dff2bc4f4" member_type: NON_VOTER last_known_addr { host: "127.8.22.132" port: 34091 } attrs { promote: true } } }
I20251028 09:10:14.199846  9089 catalog_manager.cc:5649] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b reported cstate change: config changed from index -1 to 5533, NON_VOTER 19ef51fd6f6b47869b63ac0dff2bc4f4 (127.8.22.132) added. New cstate: current_term: 3 leader_uuid: "a73a6d78104342d7bfeb363acfac482b" committed_config { opid_index: 5533 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "19ef51fd6f6b47869b63ac0dff2bc4f4" member_type: NON_VOTER last_known_addr { host: "127.8.22.132" port: 34091 } attrs { promote: true } health_report { overall_health: UNKNOWN } } }
I20251028 09:10:14.200011  9089 catalog_manager.cc:5649] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b reported cstate change: config changed from index -1 to 5534, NON_VOTER 19ef51fd6f6b47869b63ac0dff2bc4f4 (127.8.22.132) added. New cstate: current_term: 2 leader_uuid: "a73a6d78104342d7bfeb363acfac482b" committed_config { opid_index: 5534 OBSOLETE_local: false peers { permanent_uuid: "3d8ce816521247e18c0d5b9e130edc21" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 41547 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "5cf59e115bd641f8965613ab616c9b85" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 46821 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "a73a6d78104342d7bfeb363acfac482b" member_type: VOTER last_known_addr { host: "127.8.22.131" port: 38055 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "19ef51fd6f6b47869b63ac0dff2bc4f4" member_type: NON_VOTER last_known_addr { host: "127.8.22.132" port: 34091 } attrs { promote: true } health_report { overall_health: UNKNOWN } } }
I20251028 09:10:14.264190  9410 ts_tablet_manager.cc:933] T 386ef2b749f24de9aac6947b29d3479c P 19ef51fd6f6b47869b63ac0dff2bc4f4: Initiating tablet copy from peer a73a6d78104342d7bfeb363acfac482b (127.8.22.131:38055)
I20251028 09:10:14.264540  9410 tablet_copy_client.cc:323] T 386ef2b749f24de9aac6947b29d3479c P 19ef51fd6f6b47869b63ac0dff2bc4f4: tablet copy: Beginning tablet copy session from remote peer at address 127.8.22.131:38055
I20251028 09:10:14.270756  8745 tablet_copy_service.cc:140] P a73a6d78104342d7bfeb363acfac482b: Received BeginTabletCopySession request for tablet 386ef2b749f24de9aac6947b29d3479c from peer 19ef51fd6f6b47869b63ac0dff2bc4f4 ({username='slave'} at 127.8.22.132:49481)
I20251028 09:10:14.270848  8745 tablet_copy_service.cc:161] P a73a6d78104342d7bfeb363acfac482b: Beginning new tablet copy session on tablet 386ef2b749f24de9aac6947b29d3479c from peer 19ef51fd6f6b47869b63ac0dff2bc4f4 at {username='slave'} at 127.8.22.132:49481: session id = 19ef51fd6f6b47869b63ac0dff2bc4f4-386ef2b749f24de9aac6947b29d3479c
I20251028 09:10:14.271548  8745 tablet_copy_source_session.cc:215] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b: Tablet Copy: opened 0 blocks and 1 log segments
I20251028 09:10:14.273587  9410 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 386ef2b749f24de9aac6947b29d3479c. 1 dirs total, 0 dirs full, 0 dirs failed
I20251028 09:10:14.273777  9409 ts_tablet_manager.cc:933] T 189fdd871f304eb888387f1f15973390 P 19ef51fd6f6b47869b63ac0dff2bc4f4: Initiating tablet copy from peer a73a6d78104342d7bfeb363acfac482b (127.8.22.131:38055)
I20251028 09:10:14.273962  9409 tablet_copy_client.cc:323] T 189fdd871f304eb888387f1f15973390 P 19ef51fd6f6b47869b63ac0dff2bc4f4: tablet copy: Beginning tablet copy session from remote peer at address 127.8.22.131:38055
I20251028 09:10:14.274178  8745 tablet_copy_service.cc:140] P a73a6d78104342d7bfeb363acfac482b: Received BeginTabletCopySession request for tablet 189fdd871f304eb888387f1f15973390 from peer 19ef51fd6f6b47869b63ac0dff2bc4f4 ({username='slave'} at 127.8.22.132:49481)
I20251028 09:10:14.274241  8745 tablet_copy_service.cc:161] P a73a6d78104342d7bfeb363acfac482b: Beginning new tablet copy session on tablet 189fdd871f304eb888387f1f15973390 from peer 19ef51fd6f6b47869b63ac0dff2bc4f4 at {username='slave'} at 127.8.22.132:49481: session id = 19ef51fd6f6b47869b63ac0dff2bc4f4-189fdd871f304eb888387f1f15973390
I20251028 09:10:14.274478  9413 ts_tablet_manager.cc:933] T bbe94a484b0c4198bb904ffa7d3ef621 P 19ef51fd6f6b47869b63ac0dff2bc4f4: Initiating tablet copy from peer a73a6d78104342d7bfeb363acfac482b (127.8.22.131:38055)
I20251028 09:10:14.274626  9413 tablet_copy_client.cc:323] T bbe94a484b0c4198bb904ffa7d3ef621 P 19ef51fd6f6b47869b63ac0dff2bc4f4: tablet copy: Beginning tablet copy session from remote peer at address 127.8.22.131:38055
I20251028 09:10:14.274758  8745 tablet_copy_source_session.cc:215] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b: Tablet Copy: opened 0 blocks and 1 log segments
I20251028 09:10:14.274999  8745 tablet_copy_service.cc:140] P a73a6d78104342d7bfeb363acfac482b: Received BeginTabletCopySession request for tablet bbe94a484b0c4198bb904ffa7d3ef621 from peer 19ef51fd6f6b47869b63ac0dff2bc4f4 ({username='slave'} at 127.8.22.132:49481)
I20251028 09:10:14.275041  8745 tablet_copy_service.cc:161] P a73a6d78104342d7bfeb363acfac482b: Beginning new tablet copy session on tablet bbe94a484b0c4198bb904ffa7d3ef621 from peer 19ef51fd6f6b47869b63ac0dff2bc4f4 at {username='slave'} at 127.8.22.132:49481: session id = 19ef51fd6f6b47869b63ac0dff2bc4f4-bbe94a484b0c4198bb904ffa7d3ef621
I20251028 09:10:14.275094  9409 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 189fdd871f304eb888387f1f15973390. 1 dirs total, 0 dirs full, 0 dirs failed
I20251028 09:10:14.275491  8745 tablet_copy_source_session.cc:215] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b: Tablet Copy: opened 0 blocks and 1 log segments
I20251028 09:10:14.276258  9413 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet bbe94a484b0c4198bb904ffa7d3ef621. 1 dirs total, 0 dirs full, 0 dirs failed
I20251028 09:10:14.276528  9410 tablet_copy_client.cc:806] T 386ef2b749f24de9aac6947b29d3479c P 19ef51fd6f6b47869b63ac0dff2bc4f4: tablet copy: Starting download of 0 data blocks...
I20251028 09:10:14.276691  9410 tablet_copy_client.cc:670] T 386ef2b749f24de9aac6947b29d3479c P 19ef51fd6f6b47869b63ac0dff2bc4f4: tablet copy: Starting download of 1 WAL segments...
I20251028 09:10:14.277483  9413 tablet_copy_client.cc:806] T bbe94a484b0c4198bb904ffa7d3ef621 P 19ef51fd6f6b47869b63ac0dff2bc4f4: tablet copy: Starting download of 0 data blocks...
I20251028 09:10:14.277576  9409 tablet_copy_client.cc:806] T 189fdd871f304eb888387f1f15973390 P 19ef51fd6f6b47869b63ac0dff2bc4f4: tablet copy: Starting download of 0 data blocks...
I20251028 09:10:14.277786  9409 tablet_copy_client.cc:670] T 189fdd871f304eb888387f1f15973390 P 19ef51fd6f6b47869b63ac0dff2bc4f4: tablet copy: Starting download of 1 WAL segments...
I20251028 09:10:14.277837  9413 tablet_copy_client.cc:670] T bbe94a484b0c4198bb904ffa7d3ef621 P 19ef51fd6f6b47869b63ac0dff2bc4f4: tablet copy: Starting download of 1 WAL segments...
I20251028 09:10:14.280064  9416 ts_tablet_manager.cc:933] T 9e00a8dbf00b4468bbbfc193e444299e P 19ef51fd6f6b47869b63ac0dff2bc4f4: Initiating tablet copy from peer 5cf59e115bd641f8965613ab616c9b85 (127.8.22.130:46821)
I20251028 09:10:14.280506  9416 tablet_copy_client.cc:323] T 9e00a8dbf00b4468bbbfc193e444299e P 19ef51fd6f6b47869b63ac0dff2bc4f4: tablet copy: Beginning tablet copy session from remote peer at address 127.8.22.130:46821
I20251028 09:10:14.284340  8614 tablet_copy_service.cc:140] P 5cf59e115bd641f8965613ab616c9b85: Received BeginTabletCopySession request for tablet 9e00a8dbf00b4468bbbfc193e444299e from peer 19ef51fd6f6b47869b63ac0dff2bc4f4 ({username='slave'} at 127.8.22.132:56711)
I20251028 09:10:14.284437  8614 tablet_copy_service.cc:161] P 5cf59e115bd641f8965613ab616c9b85: Beginning new tablet copy session on tablet 9e00a8dbf00b4468bbbfc193e444299e from peer 19ef51fd6f6b47869b63ac0dff2bc4f4 at {username='slave'} at 127.8.22.132:56711: session id = 19ef51fd6f6b47869b63ac0dff2bc4f4-9e00a8dbf00b4468bbbfc193e444299e
I20251028 09:10:14.285043  8614 tablet_copy_source_session.cc:215] T 9e00a8dbf00b4468bbbfc193e444299e P 5cf59e115bd641f8965613ab616c9b85: Tablet Copy: opened 0 blocks and 1 log segments
I20251028 09:10:14.285663  9416 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 9e00a8dbf00b4468bbbfc193e444299e. 1 dirs total, 0 dirs full, 0 dirs failed
I20251028 09:10:14.287024  9416 tablet_copy_client.cc:806] T 9e00a8dbf00b4468bbbfc193e444299e P 19ef51fd6f6b47869b63ac0dff2bc4f4: tablet copy: Starting download of 0 data blocks...
I20251028 09:10:14.287149  9416 tablet_copy_client.cc:670] T 9e00a8dbf00b4468bbbfc193e444299e P 19ef51fd6f6b47869b63ac0dff2bc4f4: tablet copy: Starting download of 1 WAL segments...
I20251028 09:10:14.323652  9410 tablet_copy_client.cc:538] T 386ef2b749f24de9aac6947b29d3479c P 19ef51fd6f6b47869b63ac0dff2bc4f4: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20251028 09:10:14.324865  9419 ts_tablet_manager.cc:933] T a208b359e1a44c018a8ad71c60ff8fed P 19ef51fd6f6b47869b63ac0dff2bc4f4: Initiating tablet copy from peer a73a6d78104342d7bfeb363acfac482b (127.8.22.131:38055)
I20251028 09:10:14.325021  9413 tablet_copy_client.cc:538] T bbe94a484b0c4198bb904ffa7d3ef621 P 19ef51fd6f6b47869b63ac0dff2bc4f4: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20251028 09:10:14.326149  9413 tablet_bootstrap.cc:492] T bbe94a484b0c4198bb904ffa7d3ef621 P 19ef51fd6f6b47869b63ac0dff2bc4f4: Bootstrap starting.
I20251028 09:10:14.325255  9410 tablet_bootstrap.cc:492] T 386ef2b749f24de9aac6947b29d3479c P 19ef51fd6f6b47869b63ac0dff2bc4f4: Bootstrap starting.
I20251028 09:10:14.325311  9418 ts_tablet_manager.cc:933] T 61b94d4520cd4af999e8e38b3a475ac4 P 19ef51fd6f6b47869b63ac0dff2bc4f4: Initiating tablet copy from peer 5cf59e115bd641f8965613ab616c9b85 (127.8.22.130:46821)
I20251028 09:10:14.340335  9419 tablet_copy_client.cc:323] T a208b359e1a44c018a8ad71c60ff8fed P 19ef51fd6f6b47869b63ac0dff2bc4f4: tablet copy: Beginning tablet copy session from remote peer at address 127.8.22.131:38055
I20251028 09:10:14.341140  9418 tablet_copy_client.cc:323] T 61b94d4520cd4af999e8e38b3a475ac4 P 19ef51fd6f6b47869b63ac0dff2bc4f4: tablet copy: Beginning tablet copy session from remote peer at address 127.8.22.130:46821
I20251028 09:10:14.341536  8614 tablet_copy_service.cc:140] P 5cf59e115bd641f8965613ab616c9b85: Received BeginTabletCopySession request for tablet 61b94d4520cd4af999e8e38b3a475ac4 from peer 19ef51fd6f6b47869b63ac0dff2bc4f4 ({username='slave'} at 127.8.22.132:56711)
I20251028 09:10:14.341709  8614 tablet_copy_service.cc:161] P 5cf59e115bd641f8965613ab616c9b85: Beginning new tablet copy session on tablet 61b94d4520cd4af999e8e38b3a475ac4 from peer 19ef51fd6f6b47869b63ac0dff2bc4f4 at {username='slave'} at 127.8.22.132:56711: session id = 19ef51fd6f6b47869b63ac0dff2bc4f4-61b94d4520cd4af999e8e38b3a475ac4
I20251028 09:10:14.341874  9416 tablet_copy_client.cc:538] T 9e00a8dbf00b4468bbbfc193e444299e P 19ef51fd6f6b47869b63ac0dff2bc4f4: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20251028 09:10:14.342696  8614 tablet_copy_source_session.cc:215] T 61b94d4520cd4af999e8e38b3a475ac4 P 5cf59e115bd641f8965613ab616c9b85: Tablet Copy: opened 0 blocks and 1 log segments
I20251028 09:10:14.342890  9416 tablet_bootstrap.cc:492] T 9e00a8dbf00b4468bbbfc193e444299e P 19ef51fd6f6b47869b63ac0dff2bc4f4: Bootstrap starting.
I20251028 09:10:14.343057  9418 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 61b94d4520cd4af999e8e38b3a475ac4. 1 dirs total, 0 dirs full, 0 dirs failed
I20251028 09:10:14.348333  9418 tablet_copy_client.cc:806] T 61b94d4520cd4af999e8e38b3a475ac4 P 19ef51fd6f6b47869b63ac0dff2bc4f4: tablet copy: Starting download of 0 data blocks...
I20251028 09:10:14.350679  9409 tablet_copy_client.cc:538] T 189fdd871f304eb888387f1f15973390 P 19ef51fd6f6b47869b63ac0dff2bc4f4: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20251028 09:10:14.355505  8744 tablet_copy_service.cc:140] P a73a6d78104342d7bfeb363acfac482b: Received BeginTabletCopySession request for tablet a208b359e1a44c018a8ad71c60ff8fed from peer 19ef51fd6f6b47869b63ac0dff2bc4f4 ({username='slave'} at 127.8.22.132:49481)
I20251028 09:10:14.355670  8744 tablet_copy_service.cc:161] P a73a6d78104342d7bfeb363acfac482b: Beginning new tablet copy session on tablet a208b359e1a44c018a8ad71c60ff8fed from peer 19ef51fd6f6b47869b63ac0dff2bc4f4 at {username='slave'} at 127.8.22.132:49481: session id = 19ef51fd6f6b47869b63ac0dff2bc4f4-a208b359e1a44c018a8ad71c60ff8fed
I20251028 09:10:14.356328  8744 tablet_copy_source_session.cc:215] T a208b359e1a44c018a8ad71c60ff8fed P a73a6d78104342d7bfeb363acfac482b: Tablet Copy: opened 0 blocks and 1 log segments
I20251028 09:10:14.352563  9409 tablet_bootstrap.cc:492] T 189fdd871f304eb888387f1f15973390 P 19ef51fd6f6b47869b63ac0dff2bc4f4: Bootstrap starting.
I20251028 09:10:14.357640  9419 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet a208b359e1a44c018a8ad71c60ff8fed. 1 dirs total, 0 dirs full, 0 dirs failed
I20251028 09:10:14.359035  9419 tablet_copy_client.cc:806] T a208b359e1a44c018a8ad71c60ff8fed P 19ef51fd6f6b47869b63ac0dff2bc4f4: tablet copy: Starting download of 0 data blocks...
I20251028 09:10:14.359158  9419 tablet_copy_client.cc:670] T a208b359e1a44c018a8ad71c60ff8fed P 19ef51fd6f6b47869b63ac0dff2bc4f4: tablet copy: Starting download of 1 WAL segments...
I20251028 09:10:14.351795  9418 tablet_copy_client.cc:670] T 61b94d4520cd4af999e8e38b3a475ac4 P 19ef51fd6f6b47869b63ac0dff2bc4f4: tablet copy: Starting download of 1 WAL segments...
I20251028 09:10:14.379009  9419 tablet_copy_client.cc:538] T a208b359e1a44c018a8ad71c60ff8fed P 19ef51fd6f6b47869b63ac0dff2bc4f4: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20251028 09:10:14.380204  9419 tablet_bootstrap.cc:492] T a208b359e1a44c018a8ad71c60ff8fed P 19ef51fd6f6b47869b63ac0dff2bc4f4: Bootstrap starting.
I20251028 09:10:14.382508  9418 tablet_copy_client.cc:538] T 61b94d4520cd4af999e8e38b3a475ac4 P 19ef51fd6f6b47869b63ac0dff2bc4f4: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20251028 09:10:14.384258  9418 tablet_bootstrap.cc:492] T 61b94d4520cd4af999e8e38b3a475ac4 P 19ef51fd6f6b47869b63ac0dff2bc4f4: Bootstrap starting.
I20251028 09:10:14.475512  8282 meta_cache.cc:1510] marking tablet server 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547) as failed
W20251028 09:10:14.475613  8282 meta_cache.cc:302] tablet 386ef2b749f24de9aac6947b29d3479c: replica 3d8ce816521247e18c0d5b9e130edc21 (127.8.22.129:41547) has failed: Network error: TS failed: Client connection negotiation failed: client connection to 127.8.22.129:41547: connect: Connection refused (error 111)
I20251028 09:10:14.524842  9416 log.cc:826] T 9e00a8dbf00b4468bbbfc193e444299e P 19ef51fd6f6b47869b63ac0dff2bc4f4: Log is configured to *not* fsync() on all Append() calls
I20251028 09:10:14.618392  8282 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6m1lU8/build/release/bin/kudu with pid 8512
W20251028 09:10:14.654281  8899 connection.cc:537] server connection from 127.8.22.130:55947 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20251028 09:10:14.654637  8661 connection.cc:537] client connection to 127.8.22.130:46821 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20251028 09:10:14.655128  8661 consensus_peers.cc:597] T 386ef2b749f24de9aac6947b29d3479c P a73a6d78104342d7bfeb363acfac482b -> Peer 5cf59e115bd641f8965613ab616c9b85 (127.8.22.130:46821): Couldn't send request to peer 5cf59e115bd641f8965613ab616c9b85. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.130:46821: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251028 09:10:14.655177  8661 consensus_peers.cc:597] T 189fdd871f304eb888387f1f15973390 P a73a6d78104342d7bfeb363acfac482b -> Peer 5cf59e115bd641f8965613ab616c9b85 (127.8.22.130:46821): Couldn't send request to peer 5cf59e115bd641f8965613ab616c9b85. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.130:46821: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
I20251028 09:10:14.655166  8282 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6m1lU8/build/release/bin/kudu with pid 8643
W20251028 09:10:14.655201  8661 consensus_peers.cc:597] T bbe94a484b0c4198bb904ffa7d3ef621 P a73a6d78104342d7bfeb363acfac482b -> Peer 5cf59e115bd641f8965613ab616c9b85 (127.8.22.130:46821): Couldn't send request to peer 5cf59e115bd641f8965613ab616c9b85. Status: Network error: Client connection negotiation failed: client connection to 127.8.22.130:46821: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
I20251028 09:10:14.683231  8282 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6m1lU8/build/release/bin/kudu with pid 8830
I20251028 09:10:14.689744  8282 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6m1lU8/build/release/bin/kudu with pid 9059
2025-10-28T09:10:14Z chronyd exiting
[       OK ] MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate (15683 ms)
[----------] 1 test from MaintenanceModeRF3ITest (15684 ms total)
[----------] 1 test from RollingRestartArgs/RollingRestartITest
[ RUN      ] RollingRestartArgs/RollingRestartITest.TestWorkloads/4
2025-10-28T09:10:14Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2025-10-28T09:10:14Z Disabled control of system clock
I20251028 09:10:14.738444  8282 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6m1lU8/build/release/bin/kudu
/tmp/dist-test-task6m1lU8/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.8.22.190:37123
--webserver_interface=127.8.22.190
--webserver_port=0
--builtin_ntp_servers=127.8.22.148:35497
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.8.22.190:37123
--location_mapping_cmd=/tmp/dist-test-task6m1lU8/build/release/bin/testdata/assign-location.py --state_store=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/location-assignment.state --map /L0:4
--master_client_location_assignment_enabled=false with env {}
W20251028 09:10:14.818552  9447 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251028 09:10:14.818745  9447 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251028 09:10:14.818764  9447 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251028 09:10:14.820333  9447 flags.cc:425] Enabled experimental flag: --ipki_ca_key_size=768
W20251028 09:10:14.820394  9447 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251028 09:10:14.820408  9447 flags.cc:425] Enabled experimental flag: --tsk_num_rsa_bits=512
W20251028 09:10:14.820420  9447 flags.cc:425] Enabled experimental flag: --rpc_reuseport=true
I20251028 09:10:14.822032  9447 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.8.22.148:35497
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/master-0/wal
--location_mapping_cmd=/tmp/dist-test-task6m1lU8/build/release/bin/testdata/assign-location.py --state_store=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/location-assignment.state --map /L0:4
--ipki_ca_key_size=768
--master_addresses=127.8.22.190:37123
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.8.22.190:37123
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.8.22.190
--webserver_port=0
--never_fsync=true
--heap_profile_path=/tmp/kudu.9447
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true
Master server version:
kudu 1.19.0-SNAPSHOT
revision 60f5e5267b92c39485a66121d3ce3cc7ef57b0e0
build type RELEASE
built by None at 28 Oct 2025 08:43:13 UTC on 5fd53c4cbb9d
build id 8755
I20251028 09:10:14.822280  9447 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251028 09:10:14.822535  9447 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251028 09:10:14.825165  9453 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:14.825233  9455 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:14.825165  9452 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:10:14.825456  9447 server_base.cc:1047] running on GCE node
I20251028 09:10:14.825678  9447 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251028 09:10:14.825927  9447 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251028 09:10:14.827044  9447 hybrid_clock.cc:648] HybridClock initialized: now 1761642614827025 us; error 36 us; skew 500 ppm
I20251028 09:10:14.828444  9447 webserver.cc:492] Webserver started at http://127.8.22.190:34863/ using document root <none> and password file <none>
I20251028 09:10:14.828670  9447 fs_manager.cc:362] Metadata directory not provided
I20251028 09:10:14.828725  9447 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251028 09:10:14.828851  9447 server_base.cc:895] This appears to be a new deployment of Kudu; creating new FS layout
I20251028 09:10:14.829809  9447 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/master-0/data/instance:
uuid: "69f8c5efca4d447697bf11380b9957b2"
format_stamp: "Formatted at 2025-10-28 09:10:14 on dist-test-slave-kqwd"
I20251028 09:10:14.830157  9447 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/master-0/wal/instance:
uuid: "69f8c5efca4d447697bf11380b9957b2"
format_stamp: "Formatted at 2025-10-28 09:10:14 on dist-test-slave-kqwd"
I20251028 09:10:14.831570  9447 fs_manager.cc:696] Time spent creating directory manager: real 0.001s	user 0.003s	sys 0.000s
I20251028 09:10:14.832383  9461 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:14.832598  9447 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.001s	sys 0.000s
I20251028 09:10:14.832679  9447 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/master-0/data,/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/master-0/wal
uuid: "69f8c5efca4d447697bf11380b9957b2"
format_stamp: "Formatted at 2025-10-28 09:10:14 on dist-test-slave-kqwd"
I20251028 09:10:14.832744  9447 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251028 09:10:14.846094  9447 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251028 09:10:14.846390  9447 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251028 09:10:14.846518  9447 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251028 09:10:14.850642  9447 rpc_server.cc:307] RPC server started. Bound to: 127.8.22.190:37123
I20251028 09:10:14.850725  9513 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.8.22.190:37123 every 8 connection(s)
I20251028 09:10:14.851091  9447 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/master-0/data/info.pb
I20251028 09:10:14.851788  9514 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20251028 09:10:14.852519  8282 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6m1lU8/build/release/bin/kudu as pid 9447
I20251028 09:10:14.852624  8282 external_mini_cluster.cc:1442] Reading /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/master-0/wal/instance
I20251028 09:10:14.854493  9514 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 69f8c5efca4d447697bf11380b9957b2: Bootstrap starting.
I20251028 09:10:14.855221  9514 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P 69f8c5efca4d447697bf11380b9957b2: Neither blocks nor log segments found. Creating new log.
I20251028 09:10:14.855540  9514 log.cc:826] T 00000000000000000000000000000000 P 69f8c5efca4d447697bf11380b9957b2: Log is configured to *not* fsync() on all Append() calls
I20251028 09:10:14.856216  9514 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 69f8c5efca4d447697bf11380b9957b2: No bootstrap required, opened a new log
I20251028 09:10:14.857687  9514 raft_consensus.cc:359] T 00000000000000000000000000000000 P 69f8c5efca4d447697bf11380b9957b2 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "69f8c5efca4d447697bf11380b9957b2" member_type: VOTER last_known_addr { host: "127.8.22.190" port: 37123 } }
I20251028 09:10:14.857851  9514 raft_consensus.cc:385] T 00000000000000000000000000000000 P 69f8c5efca4d447697bf11380b9957b2 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251028 09:10:14.857885  9514 raft_consensus.cc:740] T 00000000000000000000000000000000 P 69f8c5efca4d447697bf11380b9957b2 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 69f8c5efca4d447697bf11380b9957b2, State: Initialized, Role: FOLLOWER
I20251028 09:10:14.858006  9514 consensus_queue.cc:260] T 00000000000000000000000000000000 P 69f8c5efca4d447697bf11380b9957b2 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "69f8c5efca4d447697bf11380b9957b2" member_type: VOTER last_known_addr { host: "127.8.22.190" port: 37123 } }
I20251028 09:10:14.858084  9514 raft_consensus.cc:399] T 00000000000000000000000000000000 P 69f8c5efca4d447697bf11380b9957b2 [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20251028 09:10:14.858129  9514 raft_consensus.cc:493] T 00000000000000000000000000000000 P 69f8c5efca4d447697bf11380b9957b2 [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20251028 09:10:14.858184  9514 raft_consensus.cc:3060] T 00000000000000000000000000000000 P 69f8c5efca4d447697bf11380b9957b2 [term 0 FOLLOWER]: Advancing to term 1
I20251028 09:10:14.858899  9514 raft_consensus.cc:515] T 00000000000000000000000000000000 P 69f8c5efca4d447697bf11380b9957b2 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "69f8c5efca4d447697bf11380b9957b2" member_type: VOTER last_known_addr { host: "127.8.22.190" port: 37123 } }
I20251028 09:10:14.859052  9514 leader_election.cc:304] T 00000000000000000000000000000000 P 69f8c5efca4d447697bf11380b9957b2 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 69f8c5efca4d447697bf11380b9957b2; no voters: 
I20251028 09:10:14.859217  9514 leader_election.cc:290] T 00000000000000000000000000000000 P 69f8c5efca4d447697bf11380b9957b2 [CANDIDATE]: Term 1 election: Requested vote from peers 
I20251028 09:10:14.859268  9519 raft_consensus.cc:2804] T 00000000000000000000000000000000 P 69f8c5efca4d447697bf11380b9957b2 [term 1 FOLLOWER]: Leader election won for term 1
I20251028 09:10:14.859370  9519 raft_consensus.cc:697] T 00000000000000000000000000000000 P 69f8c5efca4d447697bf11380b9957b2 [term 1 LEADER]: Becoming Leader. State: Replica: 69f8c5efca4d447697bf11380b9957b2, State: Running, Role: LEADER
I20251028 09:10:14.859449  9519 consensus_queue.cc:237] T 00000000000000000000000000000000 P 69f8c5efca4d447697bf11380b9957b2 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "69f8c5efca4d447697bf11380b9957b2" member_type: VOTER last_known_addr { host: "127.8.22.190" port: 37123 } }
I20251028 09:10:14.859476  9514 sys_catalog.cc:565] T 00000000000000000000000000000000 P 69f8c5efca4d447697bf11380b9957b2 [sys.catalog]: configured and running, proceeding with master startup.
I20251028 09:10:14.859776  9521 sys_catalog.cc:455] T 00000000000000000000000000000000 P 69f8c5efca4d447697bf11380b9957b2 [sys.catalog]: SysCatalogTable state changed. Reason: New leader 69f8c5efca4d447697bf11380b9957b2. Latest consensus state: current_term: 1 leader_uuid: "69f8c5efca4d447697bf11380b9957b2" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "69f8c5efca4d447697bf11380b9957b2" member_type: VOTER last_known_addr { host: "127.8.22.190" port: 37123 } } }
I20251028 09:10:14.859992  9521 sys_catalog.cc:458] T 00000000000000000000000000000000 P 69f8c5efca4d447697bf11380b9957b2 [sys.catalog]: This master's current role is: LEADER
I20251028 09:10:14.860262  9520 sys_catalog.cc:455] T 00000000000000000000000000000000 P 69f8c5efca4d447697bf11380b9957b2 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "69f8c5efca4d447697bf11380b9957b2" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "69f8c5efca4d447697bf11380b9957b2" member_type: VOTER last_known_addr { host: "127.8.22.190" port: 37123 } } }
I20251028 09:10:14.860311  9526 catalog_manager.cc:1485] Loading table and tablet metadata into memory...
I20251028 09:10:14.860356  9520 sys_catalog.cc:458] T 00000000000000000000000000000000 P 69f8c5efca4d447697bf11380b9957b2 [sys.catalog]: This master's current role is: LEADER
I20251028 09:10:14.860800  9526 catalog_manager.cc:1494] Initializing Kudu cluster ID...
I20251028 09:10:14.862283  9526 catalog_manager.cc:1357] Generated new cluster ID: 4788fcebbf654588877d305a5e95cfac
I20251028 09:10:14.862341  9526 catalog_manager.cc:1505] Initializing Kudu internal certificate authority...
I20251028 09:10:14.893712  9526 catalog_manager.cc:1380] Generated new certificate authority record
I20251028 09:10:14.894492  9526 catalog_manager.cc:1514] Loading token signing keys...
I20251028 09:10:14.904487  9526 catalog_manager.cc:6022] T 00000000000000000000000000000000 P 69f8c5efca4d447697bf11380b9957b2: Generated new TSK 0
I20251028 09:10:14.904719  9526 catalog_manager.cc:1524] Initializing in-progress tserver states...
I20251028 09:10:14.915400  8282 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6m1lU8/build/release/bin/kudu
/tmp/dist-test-task6m1lU8/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.8.22.129:0
--local_ip_for_outbound_sockets=127.8.22.129
--webserver_interface=127.8.22.129
--webserver_port=0
--tserver_master_addrs=127.8.22.190:37123
--builtin_ntp_servers=127.8.22.148:35497
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20251028 09:10:14.998121  9538 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251028 09:10:14.998313  9538 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251028 09:10:14.998337  9538 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251028 09:10:14.999854  9538 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251028 09:10:14.999922  9538 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.8.22.129
I20251028 09:10:15.001508  9538 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.8.22.148:35497
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.8.22.129:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.8.22.129
--webserver_port=0
--tserver_master_addrs=127.8.22.190:37123
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.9538
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.8.22.129
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 60f5e5267b92c39485a66121d3ce3cc7ef57b0e0
build type RELEASE
built by None at 28 Oct 2025 08:43:13 UTC on 5fd53c4cbb9d
build id 8755
I20251028 09:10:15.001722  9538 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251028 09:10:15.001969  9538 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251028 09:10:15.004813  9543 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:15.004822  9546 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:15.004822  9544 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:10:15.005151  9538 server_base.cc:1047] running on GCE node
I20251028 09:10:15.005340  9538 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251028 09:10:15.005657  9538 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251028 09:10:15.006851  9538 hybrid_clock.cc:648] HybridClock initialized: now 1761642615006843 us; error 30 us; skew 500 ppm
I20251028 09:10:15.008253  9538 webserver.cc:492] Webserver started at http://127.8.22.129:38771/ using document root <none> and password file <none>
I20251028 09:10:15.008555  9538 fs_manager.cc:362] Metadata directory not provided
I20251028 09:10:15.008621  9538 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251028 09:10:15.008733  9538 server_base.cc:895] This appears to be a new deployment of Kudu; creating new FS layout
I20251028 09:10:15.009603  9538 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data/instance:
uuid: "817e1b13d456460eb9915890dd578911"
format_stamp: "Formatted at 2025-10-28 09:10:15 on dist-test-slave-kqwd"
I20251028 09:10:15.009928  9538 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/wal/instance:
uuid: "817e1b13d456460eb9915890dd578911"
format_stamp: "Formatted at 2025-10-28 09:10:15 on dist-test-slave-kqwd"
I20251028 09:10:15.011140  9538 fs_manager.cc:696] Time spent creating directory manager: real 0.001s	user 0.003s	sys 0.000s
I20251028 09:10:15.011922  9552 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:15.012145  9538 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.001s	sys 0.000s
I20251028 09:10:15.012241  9538 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data,/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/wal
uuid: "817e1b13d456460eb9915890dd578911"
format_stamp: "Formatted at 2025-10-28 09:10:15 on dist-test-slave-kqwd"
I20251028 09:10:15.012319  9538 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251028 09:10:15.027007  9538 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251028 09:10:15.027341  9538 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251028 09:10:15.027597  9538 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251028 09:10:15.027917  9538 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251028 09:10:15.028330  9538 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20251028 09:10:15.028378  9538 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:15.028409  9538 ts_tablet_manager.cc:616] Registered 0 tablets
I20251028 09:10:15.028425  9538 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:15.035707  9538 rpc_server.cc:307] RPC server started. Bound to: 127.8.22.129:39271
I20251028 09:10:15.035806  9665 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.8.22.129:39271 every 8 connection(s)
I20251028 09:10:15.036154  9538 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data/info.pb
I20251028 09:10:15.040529  8282 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6m1lU8/build/release/bin/kudu as pid 9538
I20251028 09:10:15.040647  8282 external_mini_cluster.cc:1442] Reading /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/wal/instance
I20251028 09:10:15.041692  9666 heartbeater.cc:344] Connected to a master server at 127.8.22.190:37123
I20251028 09:10:15.041807  9666 heartbeater.cc:461] Registering TS with master...
I20251028 09:10:15.041997  8282 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6m1lU8/build/release/bin/kudu
/tmp/dist-test-task6m1lU8/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.8.22.130:0
--local_ip_for_outbound_sockets=127.8.22.130
--webserver_interface=127.8.22.130
--webserver_port=0
--tserver_master_addrs=127.8.22.190:37123
--builtin_ntp_servers=127.8.22.148:35497
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20251028 09:10:15.042094  9666 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:15.088258  9478 ts_manager.cc:194] Registered new tserver with Master: 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271)
I20251028 09:10:15.089016  9478 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.8.22.129:44553
W20251028 09:10:15.127969  9669 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251028 09:10:15.128163  9669 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251028 09:10:15.128191  9669 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251028 09:10:15.129696  9669 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251028 09:10:15.129752  9669 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.8.22.130
I20251028 09:10:15.131273  9669 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.8.22.148:35497
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.8.22.130:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.8.22.130
--webserver_port=0
--tserver_master_addrs=127.8.22.190:37123
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.9669
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.8.22.130
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 60f5e5267b92c39485a66121d3ce3cc7ef57b0e0
build type RELEASE
built by None at 28 Oct 2025 08:43:13 UTC on 5fd53c4cbb9d
build id 8755
I20251028 09:10:15.131493  9669 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251028 09:10:15.131718  9669 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251028 09:10:15.134348  9678 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:15.134362  9675 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:15.134444  9676 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:10:15.134654  9669 server_base.cc:1047] running on GCE node
I20251028 09:10:15.134814  9669 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251028 09:10:15.135061  9669 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251028 09:10:15.136229  9669 hybrid_clock.cc:648] HybridClock initialized: now 1761642615136200 us; error 47 us; skew 500 ppm
I20251028 09:10:15.137514  9669 webserver.cc:492] Webserver started at http://127.8.22.130:36451/ using document root <none> and password file <none>
I20251028 09:10:15.137696  9669 fs_manager.cc:362] Metadata directory not provided
I20251028 09:10:15.137732  9669 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251028 09:10:15.137822  9669 server_base.cc:895] This appears to be a new deployment of Kudu; creating new FS layout
I20251028 09:10:15.138692  9669 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data/instance:
uuid: "a10940ed9d79478faf0dac2c7b960184"
format_stamp: "Formatted at 2025-10-28 09:10:15 on dist-test-slave-kqwd"
I20251028 09:10:15.138998  9669 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/wal/instance:
uuid: "a10940ed9d79478faf0dac2c7b960184"
format_stamp: "Formatted at 2025-10-28 09:10:15 on dist-test-slave-kqwd"
I20251028 09:10:15.140201  9669 fs_manager.cc:696] Time spent creating directory manager: real 0.001s	user 0.000s	sys 0.001s
I20251028 09:10:15.140964  9684 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:15.141196  9669 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.000s	sys 0.001s
I20251028 09:10:15.141320  9669 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data,/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/wal
uuid: "a10940ed9d79478faf0dac2c7b960184"
format_stamp: "Formatted at 2025-10-28 09:10:15 on dist-test-slave-kqwd"
I20251028 09:10:15.141381  9669 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251028 09:10:15.151162  9669 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251028 09:10:15.151466  9669 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251028 09:10:15.151590  9669 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251028 09:10:15.151819  9669 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251028 09:10:15.152171  9669 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20251028 09:10:15.152206  9669 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:15.152243  9669 ts_tablet_manager.cc:616] Registered 0 tablets
I20251028 09:10:15.152263  9669 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:15.158133  9669 rpc_server.cc:307] RPC server started. Bound to: 127.8.22.130:41397
I20251028 09:10:15.158265  9797 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.8.22.130:41397 every 8 connection(s)
I20251028 09:10:15.158535  9669 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data/info.pb
I20251028 09:10:15.159184  8282 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6m1lU8/build/release/bin/kudu as pid 9669
I20251028 09:10:15.159304  8282 external_mini_cluster.cc:1442] Reading /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/wal/instance
I20251028 09:10:15.160511  8282 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6m1lU8/build/release/bin/kudu
/tmp/dist-test-task6m1lU8/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.8.22.131:0
--local_ip_for_outbound_sockets=127.8.22.131
--webserver_interface=127.8.22.131
--webserver_port=0
--tserver_master_addrs=127.8.22.190:37123
--builtin_ntp_servers=127.8.22.148:35497
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20251028 09:10:15.164131  9798 heartbeater.cc:344] Connected to a master server at 127.8.22.190:37123
I20251028 09:10:15.164243  9798 heartbeater.cc:461] Registering TS with master...
I20251028 09:10:15.164438  9798 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:15.199139  9478 ts_manager.cc:194] Registered new tserver with Master: a10940ed9d79478faf0dac2c7b960184 (127.8.22.130:41397)
I20251028 09:10:15.199865  9478 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.8.22.130:43873
W20251028 09:10:15.244201  9801 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251028 09:10:15.244381  9801 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251028 09:10:15.244402  9801 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251028 09:10:15.245957  9801 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251028 09:10:15.246019  9801 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.8.22.131
I20251028 09:10:15.247593  9801 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.8.22.148:35497
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.8.22.131:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.8.22.131
--webserver_port=0
--tserver_master_addrs=127.8.22.190:37123
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.9801
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.8.22.131
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 60f5e5267b92c39485a66121d3ce3cc7ef57b0e0
build type RELEASE
built by None at 28 Oct 2025 08:43:13 UTC on 5fd53c4cbb9d
build id 8755
I20251028 09:10:15.247781  9801 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251028 09:10:15.247982  9801 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251028 09:10:15.250643  9810 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:10:15.250756  9801 server_base.cc:1047] running on GCE node
W20251028 09:10:15.250648  9808 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:15.250646  9807 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:10:15.251116  9801 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251028 09:10:15.251336  9801 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251028 09:10:15.252481  9801 hybrid_clock.cc:648] HybridClock initialized: now 1761642615252465 us; error 36 us; skew 500 ppm
I20251028 09:10:15.253611  9801 webserver.cc:492] Webserver started at http://127.8.22.131:36027/ using document root <none> and password file <none>
I20251028 09:10:15.253805  9801 fs_manager.cc:362] Metadata directory not provided
I20251028 09:10:15.253850  9801 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251028 09:10:15.253968  9801 server_base.cc:895] This appears to be a new deployment of Kudu; creating new FS layout
I20251028 09:10:15.254835  9801 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data/instance:
uuid: "bdeedf35d61a49bda10c70688541a0ea"
format_stamp: "Formatted at 2025-10-28 09:10:15 on dist-test-slave-kqwd"
I20251028 09:10:15.255157  9801 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/wal/instance:
uuid: "bdeedf35d61a49bda10c70688541a0ea"
format_stamp: "Formatted at 2025-10-28 09:10:15 on dist-test-slave-kqwd"
I20251028 09:10:15.256379  9801 fs_manager.cc:696] Time spent creating directory manager: real 0.001s	user 0.000s	sys 0.001s
I20251028 09:10:15.257290  9816 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:15.257444  9801 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.000s	sys 0.001s
I20251028 09:10:15.257515  9801 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data,/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/wal
uuid: "bdeedf35d61a49bda10c70688541a0ea"
format_stamp: "Formatted at 2025-10-28 09:10:15 on dist-test-slave-kqwd"
I20251028 09:10:15.257578  9801 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251028 09:10:15.282941  9801 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251028 09:10:15.283236  9801 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251028 09:10:15.283349  9801 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251028 09:10:15.283576  9801 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251028 09:10:15.283905  9801 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20251028 09:10:15.283938  9801 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:15.283977  9801 ts_tablet_manager.cc:616] Registered 0 tablets
I20251028 09:10:15.284006  9801 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:15.289528  9801 rpc_server.cc:307] RPC server started. Bound to: 127.8.22.131:39559
I20251028 09:10:15.289605  9929 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.8.22.131:39559 every 8 connection(s)
I20251028 09:10:15.289870  9801 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data/info.pb
I20251028 09:10:15.294206  9930 heartbeater.cc:344] Connected to a master server at 127.8.22.190:37123
I20251028 09:10:15.294299  9930 heartbeater.cc:461] Registering TS with master...
I20251028 09:10:15.294485  9930 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:15.295794  8282 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6m1lU8/build/release/bin/kudu as pid 9801
I20251028 09:10:15.295878  8282 external_mini_cluster.cc:1442] Reading /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/wal/instance
I20251028 09:10:15.297792  8282 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6m1lU8/build/release/bin/kudu
/tmp/dist-test-task6m1lU8/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.8.22.132:0
--local_ip_for_outbound_sockets=127.8.22.132
--webserver_interface=127.8.22.132
--webserver_port=0
--tserver_master_addrs=127.8.22.190:37123
--builtin_ntp_servers=127.8.22.148:35497
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20251028 09:10:15.328431  9478 ts_manager.cc:194] Registered new tserver with Master: bdeedf35d61a49bda10c70688541a0ea (127.8.22.131:39559)
I20251028 09:10:15.328928  9478 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.8.22.131:45719
W20251028 09:10:15.381157  9934 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251028 09:10:15.381366  9934 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251028 09:10:15.381403  9934 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251028 09:10:15.382830  9934 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251028 09:10:15.382898  9934 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.8.22.132
I20251028 09:10:15.384428  9934 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.8.22.148:35497
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.8.22.132:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.8.22.132
--webserver_port=0
--tserver_master_addrs=127.8.22.190:37123
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.9934
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.8.22.132
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 60f5e5267b92c39485a66121d3ce3cc7ef57b0e0
build type RELEASE
built by None at 28 Oct 2025 08:43:13 UTC on 5fd53c4cbb9d
build id 8755
I20251028 09:10:15.384691  9934 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251028 09:10:15.384920  9934 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251028 09:10:15.387477  9940 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:15.387542  9942 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:15.387511  9939 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:10:15.387636  9934 server_base.cc:1047] running on GCE node
I20251028 09:10:15.387859  9934 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251028 09:10:15.388029  9934 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251028 09:10:15.389155  9934 hybrid_clock.cc:648] HybridClock initialized: now 1761642615389145 us; error 30 us; skew 500 ppm
I20251028 09:10:15.390286  9934 webserver.cc:492] Webserver started at http://127.8.22.132:36911/ using document root <none> and password file <none>
I20251028 09:10:15.390498  9934 fs_manager.cc:362] Metadata directory not provided
I20251028 09:10:15.390542  9934 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251028 09:10:15.390645  9934 server_base.cc:895] This appears to be a new deployment of Kudu; creating new FS layout
I20251028 09:10:15.391594  9934 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data/instance:
uuid: "3fedf3ecbec146b6a6ba544511a16fa1"
format_stamp: "Formatted at 2025-10-28 09:10:15 on dist-test-slave-kqwd"
I20251028 09:10:15.391898  9934 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/wal/instance:
uuid: "3fedf3ecbec146b6a6ba544511a16fa1"
format_stamp: "Formatted at 2025-10-28 09:10:15 on dist-test-slave-kqwd"
I20251028 09:10:15.393050  9934 fs_manager.cc:696] Time spent creating directory manager: real 0.001s	user 0.002s	sys 0.000s
I20251028 09:10:15.393817  9948 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:15.394021  9934 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20251028 09:10:15.394093  9934 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data,/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/wal
uuid: "3fedf3ecbec146b6a6ba544511a16fa1"
format_stamp: "Formatted at 2025-10-28 09:10:15 on dist-test-slave-kqwd"
I20251028 09:10:15.394146  9934 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251028 09:10:15.435981  9934 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251028 09:10:15.436254  9934 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251028 09:10:15.436368  9934 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251028 09:10:15.436604  9934 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251028 09:10:15.436925  9934 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20251028 09:10:15.436959  9934 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:15.436981  9934 ts_tablet_manager.cc:616] Registered 0 tablets
I20251028 09:10:15.436995  9934 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:15.442754  9934 rpc_server.cc:307] RPC server started. Bound to: 127.8.22.132:43959
I20251028 09:10:15.442845 10061 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.8.22.132:43959 every 8 connection(s)
I20251028 09:10:15.443150  9934 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data/info.pb
I20251028 09:10:15.446074  8282 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6m1lU8/build/release/bin/kudu as pid 9934
I20251028 09:10:15.446177  8282 external_mini_cluster.cc:1442] Reading /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/wal/instance
I20251028 09:10:15.447690 10062 heartbeater.cc:344] Connected to a master server at 127.8.22.190:37123
I20251028 09:10:15.447788 10062 heartbeater.cc:461] Registering TS with master...
I20251028 09:10:15.447947 10062 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:15.481114  9478 ts_manager.cc:194] Registered new tserver with Master: 3fedf3ecbec146b6a6ba544511a16fa1 (127.8.22.132:43959)
I20251028 09:10:15.481613  9478 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.8.22.132:60233
I20251028 09:10:15.491102  8282 external_mini_cluster.cc:949] 4 TS(s) registered with all masters
I20251028 09:10:15.496906  8282 test_util.cc:276] Using random seed: 1649500360
I20251028 09:10:15.503516  9478 catalog_manager.cc:2257] Servicing CreateTable request from {username='slave'} at 127.0.0.1:36440:
name: "test-workload"
schema {
  columns {
    name: "key"
    type: INT32
    is_key: true
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "int_val"
    type: INT32
    is_key: false
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "string_val"
    type: STRING
    is_key: false
    is_nullable: true
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
}
num_replicas: 3
split_rows_range_bounds {
}
partition_schema {
  range_schema {
    columns {
      name: "key"
    }
  }
}
I20251028 09:10:15.510439  9732 tablet_service.cc:1505] Processing CreateTablet for tablet cd9b9f67e7d142db81a1c8be59070ef2 (DEFAULT_TABLE table=test-workload [id=1234e5dd31404e8c83ac25dbeb071334]), partition=RANGE (key) PARTITION UNBOUNDED
I20251028 09:10:15.510617  9600 tablet_service.cc:1505] Processing CreateTablet for tablet cd9b9f67e7d142db81a1c8be59070ef2 (DEFAULT_TABLE table=test-workload [id=1234e5dd31404e8c83ac25dbeb071334]), partition=RANGE (key) PARTITION UNBOUNDED
I20251028 09:10:15.510797  9732 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet cd9b9f67e7d142db81a1c8be59070ef2. 1 dirs total, 0 dirs full, 0 dirs failed
I20251028 09:10:15.510911  9600 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet cd9b9f67e7d142db81a1c8be59070ef2. 1 dirs total, 0 dirs full, 0 dirs failed
I20251028 09:10:15.512535  9996 tablet_service.cc:1505] Processing CreateTablet for tablet cd9b9f67e7d142db81a1c8be59070ef2 (DEFAULT_TABLE table=test-workload [id=1234e5dd31404e8c83ac25dbeb071334]), partition=RANGE (key) PARTITION UNBOUNDED
I20251028 09:10:15.512825  9996 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet cd9b9f67e7d142db81a1c8be59070ef2. 1 dirs total, 0 dirs full, 0 dirs failed
I20251028 09:10:15.513437 10085 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Bootstrap starting.
I20251028 09:10:15.513733 10086 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Bootstrap starting.
I20251028 09:10:15.514091 10085 tablet_bootstrap.cc:654] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Neither blocks nor log segments found. Creating new log.
I20251028 09:10:15.514348 10086 tablet_bootstrap.cc:654] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Neither blocks nor log segments found. Creating new log.
I20251028 09:10:15.514369 10085 log.cc:826] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Log is configured to *not* fsync() on all Append() calls
I20251028 09:10:15.514673 10086 log.cc:826] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Log is configured to *not* fsync() on all Append() calls
I20251028 09:10:15.515370 10086 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: No bootstrap required, opened a new log
I20251028 09:10:15.515455 10086 ts_tablet_manager.cc:1403] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Time spent bootstrapping tablet: real 0.002s	user 0.002s	sys 0.000s
I20251028 09:10:15.515866 10089 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Bootstrap starting.
I20251028 09:10:15.516074 10085 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: No bootstrap required, opened a new log
I20251028 09:10:15.516146 10085 ts_tablet_manager.cc:1403] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Time spent bootstrapping tablet: real 0.003s	user 0.001s	sys 0.001s
I20251028 09:10:15.516629 10086 raft_consensus.cc:359] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:15.516742 10086 raft_consensus.cc:385] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251028 09:10:15.516762 10086 raft_consensus.cc:740] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: a10940ed9d79478faf0dac2c7b960184, State: Initialized, Role: FOLLOWER
I20251028 09:10:15.516847 10089 tablet_bootstrap.cc:654] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Neither blocks nor log segments found. Creating new log.
I20251028 09:10:15.516844 10086 consensus_queue.cc:260] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:15.517086 10086 ts_tablet_manager.cc:1434] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Time spent starting tablet: real 0.002s	user 0.002s	sys 0.000s
I20251028 09:10:15.517149 10089 log.cc:826] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Log is configured to *not* fsync() on all Append() calls
I20251028 09:10:15.517164  9798 heartbeater.cc:499] Master 127.8.22.190:37123 was elected leader, sending a full tablet report...
I20251028 09:10:15.518177 10085 raft_consensus.cc:359] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:15.518306 10085 raft_consensus.cc:385] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251028 09:10:15.518416 10085 raft_consensus.cc:740] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 817e1b13d456460eb9915890dd578911, State: Initialized, Role: FOLLOWER
I20251028 09:10:15.518543 10085 consensus_queue.cc:260] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:15.518837 10085 ts_tablet_manager.cc:1434] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Time spent starting tablet: real 0.003s	user 0.000s	sys 0.003s
I20251028 09:10:15.518880  9666 heartbeater.cc:499] Master 127.8.22.190:37123 was elected leader, sending a full tablet report...
I20251028 09:10:15.519436 10089 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: No bootstrap required, opened a new log
I20251028 09:10:15.519524 10089 ts_tablet_manager.cc:1403] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Time spent bootstrapping tablet: real 0.004s	user 0.002s	sys 0.000s
I20251028 09:10:15.520684 10089 raft_consensus.cc:359] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:15.520802 10089 raft_consensus.cc:385] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251028 09:10:15.520821 10089 raft_consensus.cc:740] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 3fedf3ecbec146b6a6ba544511a16fa1, State: Initialized, Role: FOLLOWER
I20251028 09:10:15.520888 10089 consensus_queue.cc:260] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:15.521096 10089 ts_tablet_manager.cc:1434] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Time spent starting tablet: real 0.002s	user 0.002s	sys 0.000s
I20251028 09:10:15.521158 10062 heartbeater.cc:499] Master 127.8.22.190:37123 was elected leader, sending a full tablet report...
I20251028 09:10:15.526161 10090 raft_consensus.cc:493] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251028 09:10:15.526297 10090 raft_consensus.cc:515] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:15.526592 10090 leader_election.cc:290] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271), 3fedf3ecbec146b6a6ba544511a16fa1 (127.8.22.132:43959)
I20251028 09:10:15.530115 10016 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "a10940ed9d79478faf0dac2c7b960184" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" is_pre_election: true
I20251028 09:10:15.530102  9620 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "a10940ed9d79478faf0dac2c7b960184" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "817e1b13d456460eb9915890dd578911" is_pre_election: true
I20251028 09:10:15.530283 10016 raft_consensus.cc:2468] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate a10940ed9d79478faf0dac2c7b960184 in term 0.
I20251028 09:10:15.530283  9620 raft_consensus.cc:2468] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate a10940ed9d79478faf0dac2c7b960184 in term 0.
I20251028 09:10:15.530547  9688 leader_election.cc:304] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 3fedf3ecbec146b6a6ba544511a16fa1, a10940ed9d79478faf0dac2c7b960184; no voters: 
I20251028 09:10:15.530718 10090 raft_consensus.cc:2804] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20251028 09:10:15.530790 10090 raft_consensus.cc:493] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20251028 09:10:15.530810 10090 raft_consensus.cc:3060] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 0 FOLLOWER]: Advancing to term 1
I20251028 09:10:15.531648 10090 raft_consensus.cc:515] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:15.531792 10090 leader_election.cc:290] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [CANDIDATE]: Term 1 election: Requested vote from peers 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271), 3fedf3ecbec146b6a6ba544511a16fa1 (127.8.22.132:43959)
I20251028 09:10:15.532032  9620 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "a10940ed9d79478faf0dac2c7b960184" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "817e1b13d456460eb9915890dd578911"
I20251028 09:10:15.532032 10016 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "a10940ed9d79478faf0dac2c7b960184" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "3fedf3ecbec146b6a6ba544511a16fa1"
I20251028 09:10:15.532117 10016 raft_consensus.cc:3060] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 0 FOLLOWER]: Advancing to term 1
I20251028 09:10:15.532117  9620 raft_consensus.cc:3060] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 0 FOLLOWER]: Advancing to term 1
I20251028 09:10:15.532919  9620 raft_consensus.cc:2468] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate a10940ed9d79478faf0dac2c7b960184 in term 1.
I20251028 09:10:15.532919 10016 raft_consensus.cc:2468] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate a10940ed9d79478faf0dac2c7b960184 in term 1.
I20251028 09:10:15.533139  9688 leader_election.cc:304] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 3fedf3ecbec146b6a6ba544511a16fa1, a10940ed9d79478faf0dac2c7b960184; no voters: 
I20251028 09:10:15.533303 10090 raft_consensus.cc:2804] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 1 FOLLOWER]: Leader election won for term 1
I20251028 09:10:15.533560 10090 raft_consensus.cc:697] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 1 LEADER]: Becoming Leader. State: Replica: a10940ed9d79478faf0dac2c7b960184, State: Running, Role: LEADER
I20251028 09:10:15.533690 10090 consensus_queue.cc:237] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:15.534505  9478 catalog_manager.cc:5649] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 reported cstate change: term changed from 0 to 1, leader changed from <none> to a10940ed9d79478faf0dac2c7b960184 (127.8.22.130). New cstate: current_term: 1 leader_uuid: "a10940ed9d79478faf0dac2c7b960184" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } health_report { overall_health: UNKNOWN } } }
W20251028 09:10:15.536883  9667 tablet.cc:2378] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Can't schedule compaction. Clean time has not been advanced past its initial value.
I20251028 09:10:15.543175  8282 maintenance_mode-itest.cc:745] Restarting batch of 4 tservers: a10940ed9d79478faf0dac2c7b960184,817e1b13d456460eb9915890dd578911,3fedf3ecbec146b6a6ba544511a16fa1,bdeedf35d61a49bda10c70688541a0ea
I20251028 09:10:15.581082 10016 raft_consensus.cc:1275] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 1 FOLLOWER]: Refusing update from remote peer a10940ed9d79478faf0dac2c7b960184: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20251028 09:10:15.581363  9620 raft_consensus.cc:1275] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 1 FOLLOWER]: Refusing update from remote peer a10940ed9d79478faf0dac2c7b960184: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20251028 09:10:15.581878 10095 consensus_queue.cc:1048] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [LEADER]: Connected to new peer: Peer: permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20251028 09:10:15.582362 10095 consensus_queue.cc:1048] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [LEADER]: Connected to new peer: Peer: permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20251028 09:10:15.594043 10113 mvcc.cc:204] Tried to move back new op lower bound from 7215688153417068544 to 7215688153226481664. Current Snapshot: MvccSnapshot[applied={T|T < 7215688153417068544}]
I20251028 09:10:15.599067 10115 mvcc.cc:204] Tried to move back new op lower bound from 7215688153417068544 to 7215688153226481664. Current Snapshot: MvccSnapshot[applied={T|T < 7215688153417068544}]
I20251028 09:10:15.774310  9476 ts_manager.cc:295] Set tserver state for a10940ed9d79478faf0dac2c7b960184 to MAINTENANCE_MODE
I20251028 09:10:15.801079  9476 ts_manager.cc:295] Set tserver state for bdeedf35d61a49bda10c70688541a0ea to MAINTENANCE_MODE
I20251028 09:10:15.813138  9476 ts_manager.cc:295] Set tserver state for 817e1b13d456460eb9915890dd578911 to MAINTENANCE_MODE
I20251028 09:10:15.813691  9474 ts_manager.cc:295] Set tserver state for 3fedf3ecbec146b6a6ba544511a16fa1 to MAINTENANCE_MODE
I20251028 09:10:16.026131  9600 tablet_service.cc:1460] Tablet server 817e1b13d456460eb9915890dd578911 set to quiescing
I20251028 09:10:16.026198  9600 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:16.098632  9732 tablet_service.cc:1460] Tablet server a10940ed9d79478faf0dac2c7b960184 set to quiescing
I20251028 09:10:16.098703  9732 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
I20251028 09:10:16.102036 10133 raft_consensus.cc:993] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: : Instructing follower 3fedf3ecbec146b6a6ba544511a16fa1 to start an election
I20251028 09:10:16.102108 10133 raft_consensus.cc:1081] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 1 LEADER]: Signalling peer 3fedf3ecbec146b6a6ba544511a16fa1 to start an election
I20251028 09:10:16.102874 10015 tablet_service.cc:2038] Received Run Leader Election RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2"
dest_uuid: "3fedf3ecbec146b6a6ba544511a16fa1"
 from {username='slave'} at 127.8.22.130:45551
I20251028 09:10:16.103000 10015 raft_consensus.cc:493] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 1 FOLLOWER]: Starting forced leader election (received explicit request)
I20251028 09:10:16.103036 10015 raft_consensus.cc:3060] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 1 FOLLOWER]: Advancing to term 2
I20251028 09:10:16.103981 10015 raft_consensus.cc:515] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 2 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:16.104141 10016 raft_consensus.cc:1240] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 2 FOLLOWER]: Rejecting Update request from peer a10940ed9d79478faf0dac2c7b960184 for earlier term 1. Current term is 2. Ops: [1.290-1.291]
I20251028 09:10:16.104250 10015 leader_election.cc:290] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [CANDIDATE]: Term 2 election: Requested vote from peers 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271), a10940ed9d79478faf0dac2c7b960184 (127.8.22.130:41397)
I20251028 09:10:16.104712 10131 consensus_queue.cc:1059] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [LEADER]: Peer responded invalid term: Peer: permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 }, Status: INVALID_TERM, Last received: 1.289, Next index: 290, Last known committed idx: 289, Time since last communication: 0.000s
I20251028 09:10:16.104812 10131 raft_consensus.cc:3055] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 1 LEADER]: Stepping down as leader of term 1
I20251028 09:10:16.104836 10131 raft_consensus.cc:740] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 1 LEADER]: Becoming Follower/Learner. State: Replica: a10940ed9d79478faf0dac2c7b960184, State: Running, Role: LEADER
I20251028 09:10:16.104883 10131 consensus_queue.cc:260] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 289, Committed index: 289, Last appended: 1.292, Last appended by leader: 292, Current term: 1, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:16.104961 10131 raft_consensus.cc:3060] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 1 FOLLOWER]: Advancing to term 2
I20251028 09:10:16.108469  9620 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" candidate_term: 2 candidate_status { last_received { term: 1 index: 289 } } ignore_live_leader: true dest_uuid: "817e1b13d456460eb9915890dd578911"
I20251028 09:10:16.108568  9620 raft_consensus.cc:3060] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 1 FOLLOWER]: Advancing to term 2
I20251028 09:10:16.109321  9620 raft_consensus.cc:2410] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 2 FOLLOWER]: Leader election vote request: Denying vote to candidate 3fedf3ecbec146b6a6ba544511a16fa1 for term 2 because replica has last-logged OpId of term: 1 index: 292, which is greater than that of the candidate, which has last-logged OpId of term: 1 index: 289.
I20251028 09:10:16.110060  9752 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" candidate_term: 2 candidate_status { last_received { term: 1 index: 289 } } ignore_live_leader: true dest_uuid: "a10940ed9d79478faf0dac2c7b960184"
I20251028 09:10:16.110183  9752 raft_consensus.cc:2410] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 2 FOLLOWER]: Leader election vote request: Denying vote to candidate 3fedf3ecbec146b6a6ba544511a16fa1 for term 2 because replica has last-logged OpId of term: 1 index: 292, which is greater than that of the candidate, which has last-logged OpId of term: 1 index: 289.
I20251028 09:10:16.110399  9950 leader_election.cc:304] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [CANDIDATE]: Term 2 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 3fedf3ecbec146b6a6ba544511a16fa1; no voters: 817e1b13d456460eb9915890dd578911, a10940ed9d79478faf0dac2c7b960184
I20251028 09:10:16.110620 10200 raft_consensus.cc:2749] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 2 FOLLOWER]: Leader election lost for term 2. Reason: could not achieve majority
I20251028 09:10:16.119959  9996 tablet_service.cc:1460] Tablet server 3fedf3ecbec146b6a6ba544511a16fa1 set to quiescing
I20251028 09:10:16.120024  9996 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:16.160159  9864 tablet_service.cc:1460] Tablet server bdeedf35d61a49bda10c70688541a0ea set to quiescing
I20251028 09:10:16.160228  9864 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:16.329849  9930 heartbeater.cc:499] Master 127.8.22.190:37123 was elected leader, sending a full tablet report...
W20251028 09:10:16.346256 10130 raft_consensus.cc:670] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: failed to trigger leader election: Illegal state: leader elections are disabled
W20251028 09:10:16.405290 10217 raft_consensus.cc:670] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: failed to trigger leader election: Illegal state: leader elections are disabled
W20251028 09:10:16.528905 10200 raft_consensus.cc:670] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: failed to trigger leader election: Illegal state: leader elections are disabled
I20251028 09:10:17.257242  9732 tablet_service.cc:1460] Tablet server a10940ed9d79478faf0dac2c7b960184 set to quiescing
I20251028 09:10:17.257318  9732 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:17.313526  8282 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6m1lU8/build/release/bin/kudu with pid 9669
W20251028 09:10:17.318394 10072 meta_cache.cc:302] tablet cd9b9f67e7d142db81a1c8be59070ef2: replica a10940ed9d79478faf0dac2c7b960184 (127.8.22.130:41397) has failed: Network error: recv got EOF from 127.8.22.130:41397 (error 108)
I20251028 09:10:17.318831  8282 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6m1lU8/build/release/bin/kudu
/tmp/dist-test-task6m1lU8/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.8.22.130:41397
--local_ip_for_outbound_sockets=127.8.22.130
--tserver_master_addrs=127.8.22.190:37123
--webserver_port=36451
--webserver_interface=127.8.22.130
--builtin_ntp_servers=127.8.22.148:35497
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20251028 09:10:17.400699 10229 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251028 09:10:17.400871 10229 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251028 09:10:17.400893 10229 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251028 09:10:17.402266 10229 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251028 09:10:17.402319 10229 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.8.22.130
I20251028 09:10:17.403816 10229 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.8.22.148:35497
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.8.22.130:41397
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.8.22.130
--webserver_port=36451
--tserver_master_addrs=127.8.22.190:37123
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.10229
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.8.22.130
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 60f5e5267b92c39485a66121d3ce3cc7ef57b0e0
build type RELEASE
built by None at 28 Oct 2025 08:43:13 UTC on 5fd53c4cbb9d
build id 8755
I20251028 09:10:17.404007 10229 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251028 09:10:17.404193 10229 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251028 09:10:17.406859 10237 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:10:17.407104 10229 server_base.cc:1047] running on GCE node
W20251028 09:10:17.407025 10234 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:17.406981 10235 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:10:17.407409 10229 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251028 09:10:17.407610 10229 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251028 09:10:17.408746 10229 hybrid_clock.cc:648] HybridClock initialized: now 1761642617408735 us; error 33 us; skew 500 ppm
I20251028 09:10:17.409865 10229 webserver.cc:492] Webserver started at http://127.8.22.130:36451/ using document root <none> and password file <none>
I20251028 09:10:17.410064 10229 fs_manager.cc:362] Metadata directory not provided
I20251028 09:10:17.410120 10229 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251028 09:10:17.411405 10229 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251028 09:10:17.412024 10243 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:17.412177 10229 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.001s	sys 0.000s
I20251028 09:10:17.412273 10229 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data,/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/wal
uuid: "a10940ed9d79478faf0dac2c7b960184"
format_stamp: "Formatted at 2025-10-28 09:10:15 on dist-test-slave-kqwd"
I20251028 09:10:17.412531 10229 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251028 09:10:17.427935 10229 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251028 09:10:17.428200 10229 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251028 09:10:17.428320 10229 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251028 09:10:17.428512 10229 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251028 09:10:17.428911 10250 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251028 09:10:17.429747 10229 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251028 09:10:17.429791 10229 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251028 09:10:17.429831 10229 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251028 09:10:17.430341 10229 ts_tablet_manager.cc:616] Registered 1 tablets
I20251028 09:10:17.430372 10229 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20251028 09:10:17.430414 10250 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Bootstrap starting.
I20251028 09:10:17.437470 10229 rpc_server.cc:307] RPC server started. Bound to: 127.8.22.130:41397
I20251028 09:10:17.437525 10357 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.8.22.130:41397 every 8 connection(s)
I20251028 09:10:17.437824 10229 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data/info.pb
I20251028 09:10:17.443543  8282 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6m1lU8/build/release/bin/kudu as pid 10229
I20251028 09:10:17.443675  8282 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6m1lU8/build/release/bin/kudu with pid 9538
I20251028 09:10:17.444653 10250 log.cc:826] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Log is configured to *not* fsync() on all Append() calls
I20251028 09:10:17.447126 10358 heartbeater.cc:344] Connected to a master server at 127.8.22.190:37123
I20251028 09:10:17.447234 10358 heartbeater.cc:461] Registering TS with master...
I20251028 09:10:17.447439 10358 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:17.447986  9474 ts_manager.cc:194] Re-registered known tserver with Master: a10940ed9d79478faf0dac2c7b960184 (127.8.22.130:41397)
I20251028 09:10:17.448407  9474 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.8.22.130:54113
I20251028 09:10:17.451104  8282 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6m1lU8/build/release/bin/kudu
/tmp/dist-test-task6m1lU8/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.8.22.129:39271
--local_ip_for_outbound_sockets=127.8.22.129
--tserver_master_addrs=127.8.22.190:37123
--webserver_port=38771
--webserver_interface=127.8.22.129
--builtin_ntp_servers=127.8.22.148:35497
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20251028 09:10:17.455370  9975 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49630: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:17.457348  9975 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49630: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:17.457352  9976 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49630: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:17.480690  9976 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49630: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:17.481484  9976 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49630: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:17.486712  9976 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49630: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:17.496279  9976 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49630: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
I20251028 09:10:17.515266 10250 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Bootstrap replayed 1/1 log segments. Stats: ops{read=292 overwritten=0 applied=289 ignored=0} inserts{seen=14400 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20251028 09:10:17.515633 10250 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Bootstrap complete.
I20251028 09:10:17.516934 10250 ts_tablet_manager.cc:1403] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Time spent bootstrapping tablet: real 0.087s	user 0.072s	sys 0.011s
I20251028 09:10:17.518121 10250 raft_consensus.cc:359] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 2 FOLLOWER]: Replica starting. Triggering 3 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:17.518872 10250 raft_consensus.cc:740] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 2 FOLLOWER]: Becoming Follower/Learner. State: Replica: a10940ed9d79478faf0dac2c7b960184, State: Initialized, Role: FOLLOWER
I20251028 09:10:17.519014 10250 consensus_queue.cc:260] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 289, Last appended: 1.292, Last appended by leader: 292, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:17.519258 10250 ts_tablet_manager.cc:1434] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Time spent starting tablet: real 0.002s	user 0.004s	sys 0.001s
I20251028 09:10:17.519282 10358 heartbeater.cc:499] Master 127.8.22.190:37123 was elected leader, sending a full tablet report...
W20251028 09:10:17.522820  9976 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49630: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:17.526516  9976 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49630: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:17.530009  9976 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49630: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:17.553325 10363 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251028 09:10:17.553622 10363 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251028 09:10:17.553684 10363 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251028 09:10:17.555279 10363 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251028 09:10:17.555351 10363 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.8.22.129
I20251028 09:10:17.556982 10363 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.8.22.148:35497
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.8.22.129:39271
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.8.22.129
--webserver_port=38771
--tserver_master_addrs=127.8.22.190:37123
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.10363
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.8.22.129
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 60f5e5267b92c39485a66121d3ce3cc7ef57b0e0
build type RELEASE
built by None at 28 Oct 2025 08:43:13 UTC on 5fd53c4cbb9d
build id 8755
I20251028 09:10:17.557229 10363 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251028 09:10:17.557472 10363 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251028 09:10:17.560231 10373 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:17.560227 10372 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:10:17.560472 10363 server_base.cc:1047] running on GCE node
W20251028 09:10:17.560294 10375 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:10:17.560708 10363 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251028 09:10:17.560946 10363 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251028 09:10:17.562079 10363 hybrid_clock.cc:648] HybridClock initialized: now 1761642617562061 us; error 37 us; skew 500 ppm
I20251028 09:10:17.563251 10363 webserver.cc:492] Webserver started at http://127.8.22.129:38771/ using document root <none> and password file <none>
I20251028 09:10:17.563442 10363 fs_manager.cc:362] Metadata directory not provided
I20251028 09:10:17.563498 10363 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251028 09:10:17.564709 10363 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251028 09:10:17.565416 10381 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:17.565635 10363 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20251028 09:10:17.565713 10363 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data,/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/wal
uuid: "817e1b13d456460eb9915890dd578911"
format_stamp: "Formatted at 2025-10-28 09:10:15 on dist-test-slave-kqwd"
I20251028 09:10:17.565986 10363 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251028 09:10:17.592038 10363 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251028 09:10:17.592324 10363 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251028 09:10:17.592439 10363 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251028 09:10:17.592644 10363 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251028 09:10:17.593140 10388 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251028 09:10:17.594024 10363 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251028 09:10:17.594074 10363 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251028 09:10:17.594125 10363 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251028 09:10:17.594668 10363 ts_tablet_manager.cc:616] Registered 1 tablets
I20251028 09:10:17.594700 10363 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20251028 09:10:17.594810 10388 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Bootstrap starting.
I20251028 09:10:17.601137 10363 rpc_server.cc:307] RPC server started. Bound to: 127.8.22.129:39271
I20251028 09:10:17.601203 10495 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.8.22.129:39271 every 8 connection(s)
I20251028 09:10:17.601523 10363 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data/info.pb
I20251028 09:10:17.606882  8282 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6m1lU8/build/release/bin/kudu as pid 10363
I20251028 09:10:17.607020  8282 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6m1lU8/build/release/bin/kudu with pid 9934
I20251028 09:10:17.608098 10496 heartbeater.cc:344] Connected to a master server at 127.8.22.190:37123
I20251028 09:10:17.608212 10496 heartbeater.cc:461] Registering TS with master...
I20251028 09:10:17.608407 10496 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:17.608919  9476 ts_manager.cc:194] Re-registered known tserver with Master: 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271)
I20251028 09:10:17.609324  9476 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.8.22.129:53225
I20251028 09:10:17.613135 10388 log.cc:826] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Log is configured to *not* fsync() on all Append() calls
I20251028 09:10:17.616182  8282 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6m1lU8/build/release/bin/kudu
/tmp/dist-test-task6m1lU8/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.8.22.132:43959
--local_ip_for_outbound_sockets=127.8.22.132
--tserver_master_addrs=127.8.22.190:37123
--webserver_port=36911
--webserver_interface=127.8.22.132
--builtin_ntp_servers=127.8.22.148:35497
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20251028 09:10:17.656842 10388 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Bootstrap replayed 1/1 log segments. Stats: ops{read=292 overwritten=0 applied=289 ignored=0} inserts{seen=14400 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20251028 09:10:17.657253 10388 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Bootstrap complete.
I20251028 09:10:17.658270 10388 ts_tablet_manager.cc:1403] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Time spent bootstrapping tablet: real 0.064s	user 0.048s	sys 0.011s
I20251028 09:10:17.659193 10388 raft_consensus.cc:359] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 2 FOLLOWER]: Replica starting. Triggering 3 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:17.659977 10388 raft_consensus.cc:740] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 2 FOLLOWER]: Becoming Follower/Learner. State: Replica: 817e1b13d456460eb9915890dd578911, State: Initialized, Role: FOLLOWER
I20251028 09:10:17.660115 10388 consensus_queue.cc:260] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 289, Last appended: 1.292, Last appended by leader: 292, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:17.660382 10388 ts_tablet_manager.cc:1434] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Time spent starting tablet: real 0.002s	user 0.005s	sys 0.000s
I20251028 09:10:17.660472 10496 heartbeater.cc:499] Master 127.8.22.190:37123 was elected leader, sending a full tablet report...
W20251028 09:10:17.698747 10500 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251028 09:10:17.698930 10500 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251028 09:10:17.698974 10500 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251028 09:10:17.700510 10500 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251028 09:10:17.700582 10500 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.8.22.132
I20251028 09:10:17.702201 10500 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.8.22.148:35497
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.8.22.132:43959
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.8.22.132
--webserver_port=36911
--tserver_master_addrs=127.8.22.190:37123
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.10500
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.8.22.132
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 60f5e5267b92c39485a66121d3ce3cc7ef57b0e0
build type RELEASE
built by None at 28 Oct 2025 08:43:13 UTC on 5fd53c4cbb9d
build id 8755
I20251028 09:10:17.702445 10500 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251028 09:10:17.702701 10500 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251028 09:10:17.705412 10510 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:17.705693 10507 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:17.705858 10508 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:10:17.706058 10500 server_base.cc:1047] running on GCE node
I20251028 09:10:17.706225 10500 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251028 09:10:17.706476 10500 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251028 09:10:17.707610 10500 hybrid_clock.cc:648] HybridClock initialized: now 1761642617707597 us; error 31 us; skew 500 ppm
I20251028 09:10:17.709173 10500 webserver.cc:492] Webserver started at http://127.8.22.132:36911/ using document root <none> and password file <none>
I20251028 09:10:17.709422 10500 fs_manager.cc:362] Metadata directory not provided
I20251028 09:10:17.709512 10500 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251028 09:10:17.711361 10500 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.000s	sys 0.002s
I20251028 09:10:17.712476 10516 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:17.712721 10500 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.000s	sys 0.001s
I20251028 09:10:17.712815 10500 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data,/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/wal
uuid: "3fedf3ecbec146b6a6ba544511a16fa1"
format_stamp: "Formatted at 2025-10-28 09:10:15 on dist-test-slave-kqwd"
I20251028 09:10:17.713167 10500 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251028 09:10:17.724263 10500 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251028 09:10:17.724654 10500 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251028 09:10:17.724800 10500 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251028 09:10:17.725028 10500 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251028 09:10:17.725544 10523 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251028 09:10:17.726408 10500 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251028 09:10:17.726480 10500 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251028 09:10:17.726569 10500 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251028 09:10:17.727430 10500 ts_tablet_manager.cc:616] Registered 1 tablets
I20251028 09:10:17.727478 10500 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20251028 09:10:17.727555 10523 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Bootstrap starting.
I20251028 09:10:17.735519 10500 rpc_server.cc:307] RPC server started. Bound to: 127.8.22.132:43959
I20251028 09:10:17.735661 10630 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.8.22.132:43959 every 8 connection(s)
I20251028 09:10:17.735960 10500 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data/info.pb
I20251028 09:10:17.741333  8282 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6m1lU8/build/release/bin/kudu as pid 10500
I20251028 09:10:17.741472  8282 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6m1lU8/build/release/bin/kudu with pid 9801
I20251028 09:10:17.748348  8282 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6m1lU8/build/release/bin/kudu
/tmp/dist-test-task6m1lU8/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.8.22.131:39559
--local_ip_for_outbound_sockets=127.8.22.131
--tserver_master_addrs=127.8.22.190:37123
--webserver_port=36027
--webserver_interface=127.8.22.131
--builtin_ntp_servers=127.8.22.148:35497
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20251028 09:10:17.748617 10523 log.cc:826] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Log is configured to *not* fsync() on all Append() calls
I20251028 09:10:17.751410 10631 heartbeater.cc:344] Connected to a master server at 127.8.22.190:37123
I20251028 09:10:17.751557 10631 heartbeater.cc:461] Registering TS with master...
I20251028 09:10:17.751819 10631 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:17.752379  9476 ts_manager.cc:194] Re-registered known tserver with Master: 3fedf3ecbec146b6a6ba544511a16fa1 (127.8.22.132:43959)
I20251028 09:10:17.753007  9476 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.8.22.132:39871
I20251028 09:10:17.788758 10367 raft_consensus.cc:493] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 2 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251028 09:10:17.788887 10367 raft_consensus.cc:515] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 2 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:17.789187 10367 leader_election.cc:290] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [CANDIDATE]: Term 3 pre-election: Requested pre-vote from peers 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271), 3fedf3ecbec146b6a6ba544511a16fa1 (127.8.22.132:43959)
I20251028 09:10:17.794147 10450 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "a10940ed9d79478faf0dac2c7b960184" candidate_term: 3 candidate_status { last_received { term: 1 index: 292 } } ignore_live_leader: false dest_uuid: "817e1b13d456460eb9915890dd578911" is_pre_election: true
I20251028 09:10:17.794302 10450 raft_consensus.cc:2468] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 2 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate a10940ed9d79478faf0dac2c7b960184 in term 2.
I20251028 09:10:17.794723 10244 leader_election.cc:304] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [CANDIDATE]: Term 3 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 817e1b13d456460eb9915890dd578911, a10940ed9d79478faf0dac2c7b960184; no voters: 
I20251028 09:10:17.793934 10585 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "a10940ed9d79478faf0dac2c7b960184" candidate_term: 3 candidate_status { last_received { term: 1 index: 292 } } ignore_live_leader: false dest_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" is_pre_election: true
I20251028 09:10:17.794836 10367 raft_consensus.cc:2804] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 2 FOLLOWER]: Leader pre-election won for term 3
I20251028 09:10:17.794886 10367 raft_consensus.cc:493] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 2 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20251028 09:10:17.794906 10367 raft_consensus.cc:3060] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 2 FOLLOWER]: Advancing to term 3
W20251028 09:10:17.795133 10247 leader_election.cc:343] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [CANDIDATE]: Term 3 pre-election: Tablet error from VoteRequest() call to peer 3fedf3ecbec146b6a6ba544511a16fa1 (127.8.22.132:43959): Illegal state: must be running to vote when last-logged opid is not known
I20251028 09:10:17.795964 10367 raft_consensus.cc:515] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 3 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:17.796096 10367 leader_election.cc:290] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [CANDIDATE]: Term 3 election: Requested vote from peers 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271), 3fedf3ecbec146b6a6ba544511a16fa1 (127.8.22.132:43959)
I20251028 09:10:17.796329 10585 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "a10940ed9d79478faf0dac2c7b960184" candidate_term: 3 candidate_status { last_received { term: 1 index: 292 } } ignore_live_leader: false dest_uuid: "3fedf3ecbec146b6a6ba544511a16fa1"
I20251028 09:10:17.796478 10450 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "a10940ed9d79478faf0dac2c7b960184" candidate_term: 3 candidate_status { last_received { term: 1 index: 292 } } ignore_live_leader: false dest_uuid: "817e1b13d456460eb9915890dd578911"
W20251028 09:10:17.796512 10247 leader_election.cc:343] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [CANDIDATE]: Term 3 election: Tablet error from VoteRequest() call to peer 3fedf3ecbec146b6a6ba544511a16fa1 (127.8.22.132:43959): Illegal state: must be running to vote when last-logged opid is not known
I20251028 09:10:17.796563 10450 raft_consensus.cc:3060] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 2 FOLLOWER]: Advancing to term 3
I20251028 09:10:17.797413 10450 raft_consensus.cc:2468] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 3 FOLLOWER]: Leader election vote request: Granting yes vote for candidate a10940ed9d79478faf0dac2c7b960184 in term 3.
I20251028 09:10:17.797608 10244 leader_election.cc:304] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [CANDIDATE]: Term 3 election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 817e1b13d456460eb9915890dd578911, a10940ed9d79478faf0dac2c7b960184; no voters: 3fedf3ecbec146b6a6ba544511a16fa1
I20251028 09:10:17.797739 10367 raft_consensus.cc:2804] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 3 FOLLOWER]: Leader election won for term 3
I20251028 09:10:17.797899 10367 raft_consensus.cc:697] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 3 LEADER]: Becoming Leader. State: Replica: a10940ed9d79478faf0dac2c7b960184, State: Running, Role: LEADER
I20251028 09:10:17.798000 10367 consensus_queue.cc:237] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 289, Committed index: 289, Last appended: 1.292, Last appended by leader: 292, Current term: 3, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:17.798703  9476 catalog_manager.cc:5649] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 reported cstate change: term changed from 1 to 3. New cstate: current_term: 3 leader_uuid: "a10940ed9d79478faf0dac2c7b960184" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } health_report { overall_health: UNKNOWN } } }
I20251028 09:10:17.816795 10523 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Bootstrap replayed 1/1 log segments. Stats: ops{read=289 overwritten=0 applied=289 ignored=0} inserts{seen=14400 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20251028 09:10:17.817201 10523 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Bootstrap complete.
I20251028 09:10:17.818580 10523 ts_tablet_manager.cc:1403] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Time spent bootstrapping tablet: real 0.091s	user 0.060s	sys 0.024s
I20251028 09:10:17.819243 10523 raft_consensus.cc:359] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 2 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:17.819535 10523 raft_consensus.cc:740] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 2 FOLLOWER]: Becoming Follower/Learner. State: Replica: 3fedf3ecbec146b6a6ba544511a16fa1, State: Initialized, Role: FOLLOWER
I20251028 09:10:17.819684 10523 consensus_queue.cc:260] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 289, Last appended: 1.289, Last appended by leader: 289, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:17.819933 10523 ts_tablet_manager.cc:1434] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Time spent starting tablet: real 0.001s	user 0.001s	sys 0.004s
I20251028 09:10:17.819989 10631 heartbeater.cc:499] Master 127.8.22.190:37123 was elected leader, sending a full tablet report...
W20251028 09:10:17.852974 10635 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251028 09:10:17.853147 10635 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251028 09:10:17.853169 10635 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251028 09:10:17.854677 10635 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251028 09:10:17.854739 10635 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.8.22.131
I20251028 09:10:17.856292 10635 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.8.22.148:35497
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.8.22.131:39559
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.8.22.131
--webserver_port=36027
--tserver_master_addrs=127.8.22.190:37123
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.10635
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.8.22.131
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 60f5e5267b92c39485a66121d3ce3cc7ef57b0e0
build type RELEASE
built by None at 28 Oct 2025 08:43:13 UTC on 5fd53c4cbb9d
build id 8755
I20251028 09:10:17.856523 10635 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251028 09:10:17.856760 10635 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251028 09:10:17.859326 10651 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:17.859359 10653 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:17.859372 10650 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:10:17.859618 10635 server_base.cc:1047] running on GCE node
I20251028 09:10:17.859799 10635 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251028 09:10:17.860046 10635 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251028 09:10:17.861202 10635 hybrid_clock.cc:648] HybridClock initialized: now 1761642617861174 us; error 39 us; skew 500 ppm
I20251028 09:10:17.862392 10635 webserver.cc:492] Webserver started at http://127.8.22.131:36027/ using document root <none> and password file <none>
I20251028 09:10:17.862602 10635 fs_manager.cc:362] Metadata directory not provided
I20251028 09:10:17.862668 10635 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251028 09:10:17.863958 10635 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251028 09:10:17.864643 10659 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:17.864837 10635 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20251028 09:10:17.864910 10635 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data,/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/wal
uuid: "bdeedf35d61a49bda10c70688541a0ea"
format_stamp: "Formatted at 2025-10-28 09:10:15 on dist-test-slave-kqwd"
I20251028 09:10:17.865175 10635 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251028 09:10:17.877178 10450 raft_consensus.cc:1275] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 3 FOLLOWER]: Refusing update from remote peer a10940ed9d79478faf0dac2c7b960184: Log matching property violated. Preceding OpId in replica: term: 1 index: 292. Preceding OpId from leader: term: 3 index: 293. (index mismatch)
I20251028 09:10:17.877509 10367 consensus_queue.cc:1048] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [LEADER]: Connected to new peer: Peer: permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 293, Last known committed idx: 289, Time since last communication: 0.000s
I20251028 09:10:17.879182 10585 raft_consensus.cc:3060] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 2 FOLLOWER]: Advancing to term 3
I20251028 09:10:17.880944 10585 raft_consensus.cc:1275] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 3 FOLLOWER]: Refusing update from remote peer a10940ed9d79478faf0dac2c7b960184: Log matching property violated. Preceding OpId in replica: term: 1 index: 289. Preceding OpId from leader: term: 3 index: 293. (index mismatch)
I20251028 09:10:17.881194 10367 consensus_queue.cc:1048] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [LEADER]: Connected to new peer: Peer: permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 293, Last known committed idx: 289, Time since last communication: 0.000s
I20251028 09:10:17.888470 10635 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251028 09:10:17.888761 10635 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251028 09:10:17.888885 10635 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251028 09:10:17.889118 10635 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251028 09:10:17.889724 10635 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20251028 09:10:17.889771 10635 ts_tablet_manager.cc:531] Time spent loading tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:17.889808 10635 ts_tablet_manager.cc:616] Registered 0 tablets
I20251028 09:10:17.889834 10635 ts_tablet_manager.cc:595] Time spent registering tablets: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:17.897517 10635 rpc_server.cc:307] RPC server started. Bound to: 127.8.22.131:39559
I20251028 09:10:17.897938 10635 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data/info.pb
I20251028 09:10:17.898622 10787 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.8.22.131:39559 every 8 connection(s)
I20251028 09:10:17.906013  8282 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6m1lU8/build/release/bin/kudu as pid 10635
I20251028 09:10:17.908731 10788 heartbeater.cc:344] Connected to a master server at 127.8.22.190:37123
I20251028 09:10:17.908861 10788 heartbeater.cc:461] Registering TS with master...
I20251028 09:10:17.909144 10788 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:17.909595  9476 ts_manager.cc:194] Re-registered known tserver with Master: bdeedf35d61a49bda10c70688541a0ea (127.8.22.131:39559)
I20251028 09:10:17.910055  9476 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.8.22.131:54723
I20251028 09:10:18.104316 10559 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:18.109230 10430 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:18.112206 10292 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
I20251028 09:10:18.129287 10722 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:18.425215  9474 ts_manager.cc:284] Unset tserver state for a10940ed9d79478faf0dac2c7b960184 from MAINTENANCE_MODE
W20251028 09:10:18.484608 10101 scanner-internal.cc:458] Time spent opening tablet: real 2.354s	user 0.001s	sys 0.000s
W20251028 09:10:18.485142 10100 scanner-internal.cc:458] Time spent opening tablet: real 2.355s	user 0.001s	sys 0.000s
W20251028 09:10:18.485582 10102 scanner-internal.cc:458] Time spent opening tablet: real 2.349s	user 0.001s	sys 0.000s
I20251028 09:10:18.526166  9474 ts_manager.cc:284] Unset tserver state for 817e1b13d456460eb9915890dd578911 from MAINTENANCE_MODE
I20251028 09:10:18.588758  9474 ts_manager.cc:284] Unset tserver state for bdeedf35d61a49bda10c70688541a0ea from MAINTENANCE_MODE
I20251028 09:10:18.610412  9474 ts_manager.cc:284] Unset tserver state for 3fedf3ecbec146b6a6ba544511a16fa1 from MAINTENANCE_MODE
I20251028 09:10:18.880079 10496 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:18.884815 10631 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:18.888536 10358 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:18.924981 10788 heartbeater.cc:499] Master 127.8.22.190:37123 was elected leader, sending a full tablet report...
I20251028 09:10:18.987031  9474 ts_manager.cc:295] Set tserver state for 3fedf3ecbec146b6a6ba544511a16fa1 to MAINTENANCE_MODE
I20251028 09:10:19.137017  9474 ts_manager.cc:295] Set tserver state for a10940ed9d79478faf0dac2c7b960184 to MAINTENANCE_MODE
I20251028 09:10:19.147989  9474 ts_manager.cc:295] Set tserver state for bdeedf35d61a49bda10c70688541a0ea to MAINTENANCE_MODE
I20251028 09:10:19.149194  9474 ts_manager.cc:295] Set tserver state for 817e1b13d456460eb9915890dd578911 to MAINTENANCE_MODE
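The MAINTENANCE_MODE transitions logged above are the master-side effect of Kudu's tserver state tooling in the same kudu binary; outside of this test harness they would normally be issued from the CLI, roughly as follows (the master address and tserver UUID are taken from this log; the exact subcommand form is an assumption based on Kudu's maintenance-mode tooling, not something shown in the log itself):
kudu tserver state enter_maintenance 127.8.22.190:37123 3fedf3ecbec146b6a6ba544511a16fa1
kudu tserver state exit_maintenance 127.8.22.190:37123 3fedf3ecbec146b6a6ba544511a16fa1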
I20251028 09:10:19.435336 10559 tablet_service.cc:1460] Tablet server 3fedf3ecbec146b6a6ba544511a16fa1 set to quiescing
I20251028 09:10:19.435415 10559 tablet_service.cc:1467] Tablet server has 0 leaders and 2 scanners
I20251028 09:10:19.500296 10292 tablet_service.cc:1460] Tablet server a10940ed9d79478faf0dac2c7b960184 set to quiescing
I20251028 09:10:19.500371 10292 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
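The "set to quiescing" messages correspond to tserver quiescing, which the rolling-restart workflow uses to shed tablet leadership and scanners before a server is stopped; a CLI equivalent would look roughly like the following (the RPC address is taken from this log; the subcommand form is an assumption based on Kudu's quiescing tooling):
kudu tserver quiesce start 127.8.22.130:41397
kudu tserver quiesce stop 127.8.22.130:41397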
I20251028 09:10:19.508863 10831 raft_consensus.cc:993] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Instructing follower 817e1b13d456460eb9915890dd578911 to start an election
I20251028 09:10:19.508946 10831 raft_consensus.cc:1081] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 3 LEADER]: Signalling peer 817e1b13d456460eb9915890dd578911 to start an election
I20251028 09:10:19.509160 10449 tablet_service.cc:2038] Received Run Leader Election RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2"
dest_uuid: "817e1b13d456460eb9915890dd578911"
 from {username='slave'} at 127.8.22.130:54457
I20251028 09:10:19.509290 10449 raft_consensus.cc:493] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 3 FOLLOWER]: Starting forced leader election (received explicit request)
I20251028 09:10:19.509326 10449 raft_consensus.cc:3060] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 3 FOLLOWER]: Advancing to term 4
I20251028 09:10:19.510219 10449 raft_consensus.cc:515] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 4 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:19.510545 10449 leader_election.cc:290] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [CANDIDATE]: Term 4 election: Requested vote from peers a10940ed9d79478faf0dac2c7b960184 (127.8.22.130:41397), 3fedf3ecbec146b6a6ba544511a16fa1 (127.8.22.132:43959)
I20251028 09:10:19.511181 10449 raft_consensus.cc:1240] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 4 FOLLOWER]: Rejecting Update request from peer a10940ed9d79478faf0dac2c7b960184 for earlier term 3. Current term is 4. Ops: [3.1490-3.1491]
I20251028 09:10:19.511734 10367 consensus_queue.cc:1059] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [LEADER]: Peer responded invalid term: Peer: permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 }, Status: INVALID_TERM, Last received: 3.1489, Next index: 1490, Last known committed idx: 1488, Time since last communication: 0.000s
I20251028 09:10:19.511857 10367 raft_consensus.cc:3055] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 3 LEADER]: Stepping down as leader of term 3
I20251028 09:10:19.511888 10367 raft_consensus.cc:740] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 3 LEADER]: Becoming Follower/Learner. State: Replica: a10940ed9d79478faf0dac2c7b960184, State: Running, Role: LEADER
I20251028 09:10:19.511934 10367 consensus_queue.cc:260] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 1489, Committed index: 1489, Last appended: 3.1491, Last appended by leader: 1491, Current term: 3, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:19.512022 10367 raft_consensus.cc:3060] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 3 FOLLOWER]: Advancing to term 4
W20251028 09:10:19.512904 10366 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46422: Illegal state: Replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config. Role: FOLLOWER. Consensus state: current_term: 4 committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } } }
I20251028 09:10:19.515403 10312 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "817e1b13d456460eb9915890dd578911" candidate_term: 4 candidate_status { last_received { term: 3 index: 1489 } } ignore_live_leader: true dest_uuid: "a10940ed9d79478faf0dac2c7b960184"
I20251028 09:10:19.515554 10312 raft_consensus.cc:2410] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 4 FOLLOWER]: Leader election vote request: Denying vote to candidate 817e1b13d456460eb9915890dd578911 for term 4 because replica has last-logged OpId of term: 3 index: 1491, which is greater than that of the candidate, which has last-logged OpId of term: 3 index: 1489.
I20251028 09:10:19.517611 10584 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "817e1b13d456460eb9915890dd578911" candidate_term: 4 candidate_status { last_received { term: 3 index: 1489 } } ignore_live_leader: true dest_uuid: "3fedf3ecbec146b6a6ba544511a16fa1"
I20251028 09:10:19.517709 10584 raft_consensus.cc:3060] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 3 FOLLOWER]: Advancing to term 4
I20251028 09:10:19.518242 10430 tablet_service.cc:1460] Tablet server 817e1b13d456460eb9915890dd578911 set to quiescing
I20251028 09:10:19.518307 10430 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:19.518399 10722 tablet_service.cc:1460] Tablet server bdeedf35d61a49bda10c70688541a0ea set to quiescing
I20251028 09:10:19.518443 10722 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:19.518674 10584 raft_consensus.cc:2410] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 4 FOLLOWER]: Leader election vote request: Denying vote to candidate 817e1b13d456460eb9915890dd578911 for term 4 because replica has last-logged OpId of term: 3 index: 1491, which is greater than that of the candidate, which has last-logged OpId of term: 3 index: 1489.
I20251028 09:10:19.518888 10385 leader_election.cc:304] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [CANDIDATE]: Term 4 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 817e1b13d456460eb9915890dd578911; no voters: 3fedf3ecbec146b6a6ba544511a16fa1, a10940ed9d79478faf0dac2c7b960184
I20251028 09:10:19.519481 10969 raft_consensus.cc:2749] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 4 FOLLOWER]: Leader election lost for term 4. Reason: could not achieve majority
W20251028 09:10:19.521979 10410 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57830: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:19.527911 10543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40450: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:19.533388 10270 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46422: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:19.538216 10410 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57830: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:19.543995 10543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40450: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:19.552673 10270 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46422: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:19.561578 10410 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57830: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:19.573410 10543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40450: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:19.587808 10270 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46422: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:19.598609 10410 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57830: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:19.611578 10543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40450: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:19.628613 10270 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46422: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:19.645519 10410 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57830: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:19.663414 10543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40450: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:19.681870 10270 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46422: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:19.698868 10410 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57830: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:19.717865 10543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40450: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:19.740181 10270 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46422: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:19.761036 10410 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57830: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:19.781994 10543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40450: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:19.804562 10270 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46422: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:19.814452 10970 raft_consensus.cc:670] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: failed to trigger leader election: Illegal state: leader elections are disabled
W20251028 09:10:19.827455 10410 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57830: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:19.832792 10924 raft_consensus.cc:670] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: failed to trigger leader election: Illegal state: leader elections are disabled
W20251028 09:10:19.852912 10969 raft_consensus.cc:670] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: failed to trigger leader election: Illegal state: leader elections are disabled
W20251028 09:10:19.853348 10543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40450: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:19.879827 10270 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46422: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:19.905748 10410 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57830: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:19.935623 10543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40450: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:19.965267 10270 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46422: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:19.997116 10410 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57830: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:20.031445 10543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40450: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:20.064795 10270 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46422: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:20.099771 10410 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57830: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:20.132776 10543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40450: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:20.167246 10270 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46422: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:20.202265 10410 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57830: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:20.238140 10543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40450: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:20.278538 10270 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46422: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:20.319460 10410 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57830: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:20.361275 10543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40450: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:20.402839 10270 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46422: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:20.447755 10410 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57830: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:20.492726 10543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40450: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:20.536377 10270 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46422: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:20.584254 10410 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57830: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
I20251028 09:10:20.586339 10559 tablet_service.cc:1460] Tablet server 3fedf3ecbec146b6a6ba544511a16fa1 set to quiescing
I20251028 09:10:20.586406 10559 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20251028 09:10:20.632195 10543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40450: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
I20251028 09:10:20.651412 10292 tablet_service.cc:1460] Tablet server a10940ed9d79478faf0dac2c7b960184 set to quiescing
I20251028 09:10:20.651481 10292 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20251028 09:10:20.681582 10270 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46422: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
I20251028 09:10:20.707434  8282 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6m1lU8/build/release/bin/kudu with pid 10229
W20251028 09:10:20.713227 10072 meta_cache.cc:302] tablet cd9b9f67e7d142db81a1c8be59070ef2: replica a10940ed9d79478faf0dac2c7b960184 (127.8.22.130:41397) has failed: Network error: recv got EOF from 127.8.22.130:41397 (error 108)
I20251028 09:10:20.713567  8282 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6m1lU8/build/release/bin/kudu
/tmp/dist-test-task6m1lU8/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.8.22.130:41397
--local_ip_for_outbound_sockets=127.8.22.130
--tserver_master_addrs=127.8.22.190:37123
--webserver_port=36451
--webserver_interface=127.8.22.130
--builtin_ntp_servers=127.8.22.148:35497
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20251028 09:10:20.715799 10410 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57830: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:20.715827 10409 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57830: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:20.731129 10543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40450: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:20.781620 10409 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57830: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:20.796563 10993 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251028 09:10:20.796743 10993 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251028 09:10:20.796765 10993 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251028 09:10:20.798377 10993 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251028 09:10:20.798437 10993 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.8.22.130
I20251028 09:10:20.800132 10993 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.8.22.148:35497
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.8.22.130:41397
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.8.22.130
--webserver_port=36451
--tserver_master_addrs=127.8.22.190:37123
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.10993
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.8.22.130
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 60f5e5267b92c39485a66121d3ce3cc7ef57b0e0
build type RELEASE
built by None at 28 Oct 2025 08:43:13 UTC on 5fd53c4cbb9d
build id 8755
I20251028 09:10:20.800376 10993 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251028 09:10:20.800643 10993 file_cache.cc:492] Constructed file cache with capacity 419430
W20251028 09:10:20.803396 10998 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:20.803395 11001 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:10:20.803531 10993 server_base.cc:1047] running on GCE node
W20251028 09:10:20.803396 10999 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:10:20.803751 10993 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251028 09:10:20.803946 10993 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251028 09:10:20.805095 10993 hybrid_clock.cc:648] HybridClock initialized: now 1761642620805080 us; error 26 us; skew 500 ppm
I20251028 09:10:20.806221 10993 webserver.cc:492] Webserver started at http://127.8.22.130:36451/ using document root <none> and password file <none>
I20251028 09:10:20.806411 10993 fs_manager.cc:362] Metadata directory not provided
I20251028 09:10:20.806457 10993 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251028 09:10:20.807727 10993 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.000s	sys 0.000s
I20251028 09:10:20.808431 11007 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:20.808655 10993 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20251028 09:10:20.808730 10993 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data,/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/wal
uuid: "a10940ed9d79478faf0dac2c7b960184"
format_stamp: "Formatted at 2025-10-28 09:10:15 on dist-test-slave-kqwd"
I20251028 09:10:20.809038 10993 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251028 09:10:20.824219 10993 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251028 09:10:20.824498 10993 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251028 09:10:20.824617 10993 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251028 09:10:20.824863 10993 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251028 09:10:20.825330 11014 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251028 09:10:20.826179 10993 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251028 09:10:20.826227 10993 ts_tablet_manager.cc:531] Time spent loading tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251028 09:10:20.826265 10993 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251028 09:10:20.826813 10993 ts_tablet_manager.cc:616] Registered 1 tablets
I20251028 09:10:20.826845 10993 ts_tablet_manager.cc:595] Time spent registering tablets: real 0.001s	user 0.000s	sys 0.000s
I20251028 09:10:20.826934 11014 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Bootstrap starting.
I20251028 09:10:20.832834 10993 rpc_server.cc:307] RPC server started. Bound to: 127.8.22.130:41397
I20251028 09:10:20.832893 11121 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.8.22.130:41397 every 8 connection(s)
I20251028 09:10:20.833166 10993 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data/info.pb
I20251028 09:10:20.837807 11122 heartbeater.cc:344] Connected to a master server at 127.8.22.190:37123
I20251028 09:10:20.837920 11122 heartbeater.cc:461] Registering TS with master...
I20251028 09:10:20.838138 11122 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:20.838712  9474 ts_manager.cc:194] Re-registered known tserver with Master: a10940ed9d79478faf0dac2c7b960184 (127.8.22.130:41397)
I20251028 09:10:20.838742  8282 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6m1lU8/build/release/bin/kudu as pid 10993
I20251028 09:10:20.838827  8282 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6m1lU8/build/release/bin/kudu with pid 10363
I20251028 09:10:20.839871  9474 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.8.22.130:48365
I20251028 09:10:20.845582  8282 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6m1lU8/build/release/bin/kudu
/tmp/dist-test-task6m1lU8/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.8.22.129:39271
--local_ip_for_outbound_sockets=127.8.22.129
--tserver_master_addrs=127.8.22.190:37123
--webserver_port=38771
--webserver_interface=127.8.22.129
--builtin_ntp_servers=127.8.22.148:35497
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20251028 09:10:20.869889 11014 log.cc:826] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Log is configured to *not* fsync() on all Append() calls
W20251028 09:10:20.927826 11128 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251028 09:10:20.927999 11128 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251028 09:10:20.928020 11128 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251028 09:10:20.929493 11128 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251028 09:10:20.929549 11128 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.8.22.129
I20251028 09:10:20.931087 11128 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.8.22.148:35497
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.8.22.129:39271
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.8.22.129
--webserver_port=38771
--tserver_master_addrs=127.8.22.190:37123
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.11128
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.8.22.129
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 60f5e5267b92c39485a66121d3ce3cc7ef57b0e0
build type RELEASE
built by None at 28 Oct 2025 08:43:13 UTC on 5fd53c4cbb9d
build id 8755
I20251028 09:10:20.931313 11128 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251028 09:10:20.931529 11128 file_cache.cc:492] Constructed file cache with capacity 419430
I20251028 09:10:20.934172 11128 server_base.cc:1047] running on GCE node
W20251028 09:10:20.934149 11137 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:20.934157 11134 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:20.934304 11135 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:10:20.934713 11128 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251028 09:10:20.934938 11128 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251028 09:10:20.936110 11128 hybrid_clock.cc:648] HybridClock initialized: now 1761642620936095 us; error 33 us; skew 500 ppm
I20251028 09:10:20.937290 11128 webserver.cc:492] Webserver started at http://127.8.22.129:38771/ using document root <none> and password file <none>
I20251028 09:10:20.937505 11128 fs_manager.cc:362] Metadata directory not provided
I20251028 09:10:20.937551 11128 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251028 09:10:20.938716 11128 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251028 09:10:20.939379 11143 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:20.939596 11128 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:20.939683 11128 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data,/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/wal
uuid: "817e1b13d456460eb9915890dd578911"
format_stamp: "Formatted at 2025-10-28 09:10:15 on dist-test-slave-kqwd"
I20251028 09:10:20.940039 11128 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251028 09:10:20.950080 11128 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251028 09:10:20.950337 11128 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251028 09:10:20.950460 11128 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251028 09:10:20.950665 11128 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251028 09:10:20.951201 11150 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251028 09:10:20.952329 11128 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251028 09:10:20.952371 11128 ts_tablet_manager.cc:531] Time spent loading tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251028 09:10:20.952407 11128 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
W20251028 09:10:20.952684 10543 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40450: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
I20251028 09:10:20.952943 11128 ts_tablet_manager.cc:616] Registered 1 tablets
I20251028 09:10:20.952989 11128 ts_tablet_manager.cc:595] Time spent registering tablets: real 0.001s	user 0.000s	sys 0.000s
I20251028 09:10:20.953029 11150 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Bootstrap starting.
I20251028 09:10:20.960484 11128 rpc_server.cc:307] RPC server started. Bound to: 127.8.22.129:39271
I20251028 09:10:20.960546 11257 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.8.22.129:39271 every 8 connection(s)
I20251028 09:10:20.960953 11128 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data/info.pb
I20251028 09:10:20.967978 11258 heartbeater.cc:344] Connected to a master server at 127.8.22.190:37123
I20251028 09:10:20.968094 11258 heartbeater.cc:461] Registering TS with master...
I20251028 09:10:20.968334 11258 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:20.968920  9474 ts_manager.cc:194] Re-registered known tserver with Master: 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271)
I20251028 09:10:20.969468  9474 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.8.22.129:37783
I20251028 09:10:20.970351  8282 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6m1lU8/build/release/bin/kudu as pid 11128
I20251028 09:10:20.970477  8282 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6m1lU8/build/release/bin/kudu with pid 10500
I20251028 09:10:20.977131  8282 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6m1lU8/build/release/bin/kudu
/tmp/dist-test-task6m1lU8/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.8.22.132:43959
--local_ip_for_outbound_sockets=127.8.22.132
--tserver_master_addrs=127.8.22.190:37123
--webserver_port=36911
--webserver_interface=127.8.22.132
--builtin_ntp_servers=127.8.22.148:35497
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20251028 09:10:21.006786 11150 log.cc:826] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Log is configured to *not* fsync() on all Append() calls
W20251028 09:10:21.064196 11261 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251028 09:10:21.064379 11261 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251028 09:10:21.064414 11261 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251028 09:10:21.066074 11261 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251028 09:10:21.066165 11261 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.8.22.132
I20251028 09:10:21.068011 11261 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.8.22.148:35497
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.8.22.132:43959
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.8.22.132
--webserver_port=36911
--tserver_master_addrs=127.8.22.190:37123
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.11261
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.8.22.132
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 60f5e5267b92c39485a66121d3ce3cc7ef57b0e0
build type RELEASE
built by None at 28 Oct 2025 08:43:13 UTC on 5fd53c4cbb9d
build id 8755
I20251028 09:10:21.068279 11261 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251028 09:10:21.068512 11261 file_cache.cc:492] Constructed file cache with capacity 419430
W20251028 09:10:21.071120 11268 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:21.071213 11271 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:21.071121 11269 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:10:21.072098 11261 server_base.cc:1047] running on GCE node
I20251028 09:10:21.072338 11261 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251028 09:10:21.072546 11261 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251028 09:10:21.075814 11261 hybrid_clock.cc:648] HybridClock initialized: now 1761642621075795 us; error 35 us; skew 500 ppm
I20251028 09:10:21.077093 11261 webserver.cc:492] Webserver started at http://127.8.22.132:36911/ using document root <none> and password file <none>
I20251028 09:10:21.077313 11261 fs_manager.cc:362] Metadata directory not provided
I20251028 09:10:21.077371 11261 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251028 09:10:21.083384 11261 fs_manager.cc:714] Time spent opening directory manager: real 0.005s	user 0.001s	sys 0.000s
I20251028 09:10:21.084177 11277 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:21.084405 11261 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20251028 09:10:21.084491 11261 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data,/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/wal
uuid: "3fedf3ecbec146b6a6ba544511a16fa1"
format_stamp: "Formatted at 2025-10-28 09:10:15 on dist-test-slave-kqwd"
I20251028 09:10:21.084786 11261 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251028 09:10:21.102272 11261 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251028 09:10:21.102540 11261 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251028 09:10:21.102665 11261 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251028 09:10:21.102909 11261 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251028 09:10:21.103438 11284 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251028 09:10:21.104276 11261 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251028 09:10:21.104327 11261 ts_tablet_manager.cc:531] Time spent loading tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251028 09:10:21.104368 11261 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251028 09:10:21.104879 11261 ts_tablet_manager.cc:616] Registered 1 tablets
I20251028 09:10:21.104926 11261 ts_tablet_manager.cc:595] Time spent registering tablets: real 0.001s	user 0.000s	sys 0.000s
I20251028 09:10:21.105069 11284 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Bootstrap starting.
I20251028 09:10:21.112612 11261 rpc_server.cc:307] RPC server started. Bound to: 127.8.22.132:43959
I20251028 09:10:21.113015 11261 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data/info.pb
I20251028 09:10:21.115286 11391 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.8.22.132:43959 every 8 connection(s)
I20251028 09:10:21.122561  8282 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6m1lU8/build/release/bin/kudu as pid 11261
I20251028 09:10:21.122674  8282 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6m1lU8/build/release/bin/kudu with pid 10635
I20251028 09:10:21.130506  8282 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6m1lU8/build/release/bin/kudu
/tmp/dist-test-task6m1lU8/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.8.22.131:39559
--local_ip_for_outbound_sockets=127.8.22.131
--tserver_master_addrs=127.8.22.190:37123
--webserver_port=36027
--webserver_interface=127.8.22.131
--builtin_ntp_servers=127.8.22.148:35497
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20251028 09:10:21.133725 11392 heartbeater.cc:344] Connected to a master server at 127.8.22.190:37123
I20251028 09:10:21.133836 11392 heartbeater.cc:461] Registering TS with master...
I20251028 09:10:21.134083 11392 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:21.134603  9474 ts_manager.cc:194] Re-registered known tserver with Master: 3fedf3ecbec146b6a6ba544511a16fa1 (127.8.22.132:43959)
I20251028 09:10:21.135181  9474 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.8.22.132:46097
I20251028 09:10:21.165689 11284 log.cc:826] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Log is configured to *not* fsync() on all Append() calls
I20251028 09:10:21.209921 11014 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Bootstrap replayed 1/1 log segments. Stats: ops{read=1491 overwritten=0 applied=1489 ignored=0} inserts{seen=74350 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20251028 09:10:21.210355 11014 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Bootstrap complete.
I20251028 09:10:21.212071 11014 ts_tablet_manager.cc:1403] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Time spent bootstrapping tablet: real 0.385s	user 0.306s	sys 0.067s
I20251028 09:10:21.213796 11014 raft_consensus.cc:359] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 4 FOLLOWER]: Replica starting. Triggering 2 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:21.214600 11014 raft_consensus.cc:740] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 4 FOLLOWER]: Becoming Follower/Learner. State: Replica: a10940ed9d79478faf0dac2c7b960184, State: Initialized, Role: FOLLOWER
I20251028 09:10:21.214748 11014 consensus_queue.cc:260] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 1489, Last appended: 3.1491, Last appended by leader: 1491, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:21.215003 11014 ts_tablet_manager.cc:1434] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Time spent starting tablet: real 0.003s	user 0.001s	sys 0.003s
I20251028 09:10:21.215122 11122 heartbeater.cc:499] Master 127.8.22.190:37123 was elected leader, sending a full tablet report...
W20251028 09:10:21.233469 11035 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:21.259734 11395 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251028 09:10:21.259968 11395 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251028 09:10:21.260004 11395 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251028 09:10:21.262269 11395 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251028 09:10:21.262352 11395 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.8.22.131
I20251028 09:10:21.264794 11395 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.8.22.148:35497
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.8.22.131:39559
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.8.22.131
--webserver_port=36027
--tserver_master_addrs=127.8.22.190:37123
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.11395
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.8.22.131
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 60f5e5267b92c39485a66121d3ce3cc7ef57b0e0
build type RELEASE
built by None at 28 Oct 2025 08:43:13 UTC on 5fd53c4cbb9d
build id 8755
I20251028 09:10:21.265033 11395 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251028 09:10:21.265306 11395 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251028 09:10:21.268208 11404 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:21.268246 11407 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:21.268208 11405 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:10:21.269315 11395 server_base.cc:1047] running on GCE node
I20251028 09:10:21.269505 11395 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251028 09:10:21.269752 11395 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251028 09:10:21.270920 11395 hybrid_clock.cc:648] HybridClock initialized: now 1761642621270896 us; error 42 us; skew 500 ppm
I20251028 09:10:21.272410 11395 webserver.cc:492] Webserver started at http://127.8.22.131:36027/ using document root <none> and password file <none>
I20251028 09:10:21.272655 11395 fs_manager.cc:362] Metadata directory not provided
I20251028 09:10:21.272715 11395 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251028 09:10:21.274386 11395 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.002s	sys 0.000s
I20251028 09:10:21.275177 11413 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:21.275429 11395 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20251028 09:10:21.275540 11395 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data,/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/wal
uuid: "bdeedf35d61a49bda10c70688541a0ea"
format_stamp: "Formatted at 2025-10-28 09:10:15 on dist-test-slave-kqwd"
I20251028 09:10:21.275874 11395 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251028 09:10:21.304948 11395 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251028 09:10:21.305231 11395 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251028 09:10:21.305356 11395 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251028 09:10:21.305612 11395 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251028 09:10:21.305984 11395 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20251028 09:10:21.306030 11395 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:21.306066 11395 ts_tablet_manager.cc:616] Registered 0 tablets
I20251028 09:10:21.306090 11395 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:21.313122 11395 rpc_server.cc:307] RPC server started. Bound to: 127.8.22.131:39559
I20251028 09:10:21.313359 11526 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.8.22.131:39559 every 8 connection(s)
I20251028 09:10:21.313524 11395 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data/info.pb
I20251028 09:10:21.321154  8282 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6m1lU8/build/release/bin/kudu as pid 11395
I20251028 09:10:21.324831 11527 heartbeater.cc:344] Connected to a master server at 127.8.22.190:37123
I20251028 09:10:21.324992 11527 heartbeater.cc:461] Registering TS with master...
I20251028 09:10:21.325302 11527 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:21.325754  9474 ts_manager.cc:194] Re-registered known tserver with Master: bdeedf35d61a49bda10c70688541a0ea (127.8.22.131:39559)
I20251028 09:10:21.326253  9474 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.8.22.131:48777
I20251028 09:10:21.331964 11150 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Bootstrap replayed 1/1 log segments. Stats: ops{read=1489 overwritten=0 applied=1488 ignored=0} inserts{seen=74300 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20251028 09:10:21.332396 11150 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Bootstrap complete.
I20251028 09:10:21.334101 11150 ts_tablet_manager.cc:1403] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Time spent bootstrapping tablet: real 0.381s	user 0.300s	sys 0.070s
I20251028 09:10:21.335939 11150 raft_consensus.cc:359] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 4 FOLLOWER]: Replica starting. Triggering 1 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:21.336766 11150 raft_consensus.cc:740] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 4 FOLLOWER]: Becoming Follower/Learner. State: Replica: 817e1b13d456460eb9915890dd578911, State: Initialized, Role: FOLLOWER
I20251028 09:10:21.336930 11150 consensus_queue.cc:260] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 1488, Last appended: 3.1489, Last appended by leader: 1489, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:21.337167 11150 ts_tablet_manager.cc:1434] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Time spent starting tablet: real 0.003s	user 0.006s	sys 0.000s
I20251028 09:10:21.337169 11258 heartbeater.cc:499] Master 127.8.22.190:37123 was elected leader, sending a full tablet report...
W20251028 09:10:21.415616 11035 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
I20251028 09:10:21.441089 11326 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:21.441569 11174 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:21.447003 11046 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:21.452226 11461 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:21.466702 11284 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Bootstrap replayed 1/1 log segments. Stats: ops{read=1491 overwritten=0 applied=1489 ignored=0} inserts{seen=74350 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20251028 09:10:21.467077 11284 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Bootstrap complete.
I20251028 09:10:21.468343 11284 ts_tablet_manager.cc:1403] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Time spent bootstrapping tablet: real 0.363s	user 0.295s	sys 0.052s
I20251028 09:10:21.469344 11284 raft_consensus.cc:359] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 4 FOLLOWER]: Replica starting. Triggering 2 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:21.469960 11284 raft_consensus.cc:740] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 4 FOLLOWER]: Becoming Follower/Learner. State: Replica: 3fedf3ecbec146b6a6ba544511a16fa1, State: Initialized, Role: FOLLOWER
I20251028 09:10:21.470110 11284 consensus_queue.cc:260] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 1489, Last appended: 3.1491, Last appended by leader: 1491, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:21.470321 11284 ts_tablet_manager.cc:1434] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Time spent starting tablet: real 0.002s	user 0.003s	sys 0.000s
I20251028 09:10:21.470372 11392 heartbeater.cc:499] Master 127.8.22.190:37123 was elected leader, sending a full tablet report...
W20251028 09:10:21.478456 11171 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
I20251028 09:10:21.512192 11399 raft_consensus.cc:493] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 4 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251028 09:10:21.512326 11399 raft_consensus.cc:515] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 4 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:21.512640 11399 leader_election.cc:290] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [CANDIDATE]: Term 5 pre-election: Requested pre-vote from peers 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271), 3fedf3ecbec146b6a6ba544511a16fa1 (127.8.22.132:43959)
I20251028 09:10:21.516355 11346 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "a10940ed9d79478faf0dac2c7b960184" candidate_term: 5 candidate_status { last_received { term: 3 index: 1491 } } ignore_live_leader: false dest_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" is_pre_election: true
I20251028 09:10:21.516377 11212 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "a10940ed9d79478faf0dac2c7b960184" candidate_term: 5 candidate_status { last_received { term: 3 index: 1491 } } ignore_live_leader: false dest_uuid: "817e1b13d456460eb9915890dd578911" is_pre_election: true
I20251028 09:10:21.516512 11212 raft_consensus.cc:2468] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 4 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate a10940ed9d79478faf0dac2c7b960184 in term 4.
I20251028 09:10:21.516520 11346 raft_consensus.cc:2468] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 4 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate a10940ed9d79478faf0dac2c7b960184 in term 4.
I20251028 09:10:21.516703 11011 leader_election.cc:304] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [CANDIDATE]: Term 5 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 3fedf3ecbec146b6a6ba544511a16fa1, a10940ed9d79478faf0dac2c7b960184; no voters: 
I20251028 09:10:21.516850 11399 raft_consensus.cc:2804] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 4 FOLLOWER]: Leader pre-election won for term 5
I20251028 09:10:21.516924 11399 raft_consensus.cc:493] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 4 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20251028 09:10:21.516963 11399 raft_consensus.cc:3060] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 4 FOLLOWER]: Advancing to term 5
I20251028 09:10:21.517882 11399 raft_consensus.cc:515] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 5 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:21.518023 11399 leader_election.cc:290] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [CANDIDATE]: Term 5 election: Requested vote from peers 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271), 3fedf3ecbec146b6a6ba544511a16fa1 (127.8.22.132:43959)
I20251028 09:10:21.518184 11346 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "a10940ed9d79478faf0dac2c7b960184" candidate_term: 5 candidate_status { last_received { term: 3 index: 1491 } } ignore_live_leader: false dest_uuid: "3fedf3ecbec146b6a6ba544511a16fa1"
I20251028 09:10:21.518190 11212 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "a10940ed9d79478faf0dac2c7b960184" candidate_term: 5 candidate_status { last_received { term: 3 index: 1491 } } ignore_live_leader: false dest_uuid: "817e1b13d456460eb9915890dd578911"
I20251028 09:10:21.518247 11346 raft_consensus.cc:3060] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 4 FOLLOWER]: Advancing to term 5
I20251028 09:10:21.518254 11212 raft_consensus.cc:3060] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 4 FOLLOWER]: Advancing to term 5
I20251028 09:10:21.519137 11346 raft_consensus.cc:2468] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 5 FOLLOWER]: Leader election vote request: Granting yes vote for candidate a10940ed9d79478faf0dac2c7b960184 in term 5.
I20251028 09:10:21.519147 11212 raft_consensus.cc:2468] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 5 FOLLOWER]: Leader election vote request: Granting yes vote for candidate a10940ed9d79478faf0dac2c7b960184 in term 5.
I20251028 09:10:21.519289 11008 leader_election.cc:304] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [CANDIDATE]: Term 5 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 817e1b13d456460eb9915890dd578911, a10940ed9d79478faf0dac2c7b960184; no voters: 
I20251028 09:10:21.519395 11399 raft_consensus.cc:2804] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 5 FOLLOWER]: Leader election won for term 5
I20251028 09:10:21.519490 11399 raft_consensus.cc:697] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 5 LEADER]: Becoming Leader. State: Replica: a10940ed9d79478faf0dac2c7b960184, State: Running, Role: LEADER
I20251028 09:10:21.519577 11399 consensus_queue.cc:237] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 1489, Committed index: 1489, Last appended: 3.1491, Last appended by leader: 1491, Current term: 5, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:21.520224  9474 catalog_manager.cc:5649] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 reported cstate change: term changed from 3 to 5. New cstate: current_term: 5 leader_uuid: "a10940ed9d79478faf0dac2c7b960184" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } health_report { overall_health: UNKNOWN } } }
W20251028 09:10:21.541553 11297 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
I20251028 09:10:21.603606 11212 raft_consensus.cc:1275] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 5 FOLLOWER]: Refusing update from remote peer a10940ed9d79478faf0dac2c7b960184: Log matching property violated. Preceding OpId in replica: term: 3 index: 1489. Preceding OpId from leader: term: 5 index: 1492. (index mismatch)
I20251028 09:10:21.603991 11399 consensus_queue.cc:1048] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [LEADER]: Connected to new peer: Peer: permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1492, Last known committed idx: 1488, Time since last communication: 0.000s
I20251028 09:10:21.605484 11346 raft_consensus.cc:1275] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 5 FOLLOWER]: Refusing update from remote peer a10940ed9d79478faf0dac2c7b960184: Log matching property violated. Preceding OpId in replica: term: 3 index: 1491. Preceding OpId from leader: term: 5 index: 1493. (index mismatch)
I20251028 09:10:21.605773 11399 consensus_queue.cc:1048] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [LEADER]: Connected to new peer: Peer: permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1492, Last known committed idx: 1489, Time since last communication: 0.000s
W20251028 09:10:21.851936 10101 scanner-internal.cc:458] Time spent opening tablet: real 2.306s	user 0.001s	sys 0.001s
W20251028 09:10:21.910817 10102 scanner-internal.cc:458] Time spent opening tablet: real 2.305s	user 0.001s	sys 0.000s
W20251028 09:10:21.925981 10100 scanner-internal.cc:458] Time spent opening tablet: real 2.305s	user 0.000s	sys 0.001s
I20251028 09:10:22.327170 11527 heartbeater.cc:499] Master 127.8.22.190:37123 was elected leader, sending a full tablet report...
I20251028 09:10:26.719672 11326 tablet_service.cc:1467] Tablet server has 0 leaders and 3 scanners
I20251028 09:10:26.723961 11461 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:26.724079 11174 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:26.741209 11046 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
I20251028 09:10:27.147897  9476 ts_manager.cc:284] Unset tserver state for 817e1b13d456460eb9915890dd578911 from MAINTENANCE_MODE
I20251028 09:10:27.151789  9476 ts_manager.cc:284] Unset tserver state for a10940ed9d79478faf0dac2c7b960184 from MAINTENANCE_MODE
I20251028 09:10:27.160820  9476 ts_manager.cc:284] Unset tserver state for 3fedf3ecbec146b6a6ba544511a16fa1 from MAINTENANCE_MODE
I20251028 09:10:27.183137  9476 ts_manager.cc:284] Unset tserver state for bdeedf35d61a49bda10c70688541a0ea from MAINTENANCE_MODE
I20251028 09:10:27.332396 11527 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:27.553632  9476 ts_manager.cc:295] Set tserver state for bdeedf35d61a49bda10c70688541a0ea to MAINTENANCE_MODE
I20251028 09:10:27.611327 11392 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:27.611742 11258 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:27.612147 11122 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:27.655119  9476 ts_manager.cc:295] Set tserver state for 3fedf3ecbec146b6a6ba544511a16fa1 to MAINTENANCE_MODE
I20251028 09:10:27.711236  9476 ts_manager.cc:295] Set tserver state for 817e1b13d456460eb9915890dd578911 to MAINTENANCE_MODE
I20251028 09:10:27.730715  9476 ts_manager.cc:295] Set tserver state for a10940ed9d79478faf0dac2c7b960184 to MAINTENANCE_MODE
I20251028 09:10:28.021522 11461 tablet_service.cc:1460] Tablet server bdeedf35d61a49bda10c70688541a0ea set to quiescing
I20251028 09:10:28.021615 11461 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:28.103896 11174 tablet_service.cc:1460] Tablet server 817e1b13d456460eb9915890dd578911 set to quiescing
I20251028 09:10:28.103963 11174 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:28.112485 11046 tablet_service.cc:1460] Tablet server a10940ed9d79478faf0dac2c7b960184 set to quiescing
I20251028 09:10:28.112555 11046 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
I20251028 09:10:28.123853 11597 raft_consensus.cc:993] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: : Instructing follower 3fedf3ecbec146b6a6ba544511a16fa1 to start an election
I20251028 09:10:28.123942 11597 raft_consensus.cc:1081] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 5 LEADER]: Signalling peer 3fedf3ecbec146b6a6ba544511a16fa1 to start an election
I20251028 09:10:28.124162 11346 tablet_service.cc:2038] Received Run Leader Election RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2"
dest_uuid: "3fedf3ecbec146b6a6ba544511a16fa1"
 from {username='slave'} at 127.8.22.130:43387
I20251028 09:10:28.124257 11346 raft_consensus.cc:493] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 5 FOLLOWER]: Starting forced leader election (received explicit request)
I20251028 09:10:28.124284 11346 raft_consensus.cc:3060] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 5 FOLLOWER]: Advancing to term 6
I20251028 09:10:28.125123 11346 raft_consensus.cc:515] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 6 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:28.125413 11346 leader_election.cc:290] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [CANDIDATE]: Term 6 election: Requested vote from peers 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271), a10940ed9d79478faf0dac2c7b960184 (127.8.22.130:41397)
I20251028 09:10:28.128026 11326 tablet_service.cc:1460] Tablet server 3fedf3ecbec146b6a6ba544511a16fa1 set to quiescing
I20251028 09:10:28.128077 11326 tablet_service.cc:1467] Tablet server has 0 leaders and 3 scanners
I20251028 09:10:28.129599 11346 raft_consensus.cc:1240] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 6 FOLLOWER]: Rejecting Update request from peer a10940ed9d79478faf0dac2c7b960184 for earlier term 5. Current term is 6. Ops: [5.6965-5.6966]
I20251028 09:10:28.131068 11076 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" candidate_term: 6 candidate_status { last_received { term: 5 index: 6964 } } ignore_live_leader: true dest_uuid: "a10940ed9d79478faf0dac2c7b960184"
I20251028 09:10:28.131192 11076 raft_consensus.cc:3055] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 5 LEADER]: Stepping down as leader of term 5
I20251028 09:10:28.131232 11076 raft_consensus.cc:740] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 5 LEADER]: Becoming Follower/Learner. State: Replica: a10940ed9d79478faf0dac2c7b960184, State: Running, Role: LEADER
I20251028 09:10:28.131283 11076 consensus_queue.cc:260] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 6967, Committed index: 6967, Last appended: 5.6967, Last appended by leader: 6967, Current term: 5, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
W20251028 09:10:28.131390 11036 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
I20251028 09:10:28.131500 11596 consensus_queue.cc:1059] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [NON_LEADER]: Peer responded invalid term: Peer: permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 }, Status: INVALID_TERM, Last received: 5.6964, Next index: 6965, Last known committed idx: 6964, Time since last communication: 0.000s
W20251028 09:10:28.131544 11034 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
I20251028 09:10:28.131712 11076 raft_consensus.cc:3060] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 5 FOLLOWER]: Advancing to term 6
W20251028 09:10:28.132334 11034 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
I20251028 09:10:28.132409 11076 raft_consensus.cc:2410] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 6 FOLLOWER]: Leader election vote request: Denying vote to candidate 3fedf3ecbec146b6a6ba544511a16fa1 for term 6 because replica has last-logged OpId of term: 5 index: 6967, which is greater than that of the candidate, which has last-logged OpId of term: 5 index: 6964.
W20251028 09:10:28.133816 11171 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
I20251028 09:10:28.135437 11212 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" candidate_term: 6 candidate_status { last_received { term: 5 index: 6964 } } ignore_live_leader: true dest_uuid: "817e1b13d456460eb9915890dd578911"
I20251028 09:10:28.135527 11212 raft_consensus.cc:3060] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 5 FOLLOWER]: Advancing to term 6
W20251028 09:10:28.136355 11296 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
I20251028 09:10:28.136474 11212 raft_consensus.cc:2410] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 6 FOLLOWER]: Leader election vote request: Denying vote to candidate 3fedf3ecbec146b6a6ba544511a16fa1 for term 6 because replica has last-logged OpId of term: 5 index: 6967, which is greater than that of the candidate, which has last-logged OpId of term: 5 index: 6964.
I20251028 09:10:28.136737 11278 leader_election.cc:304] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [CANDIDATE]: Term 6 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 3fedf3ecbec146b6a6ba544511a16fa1; no voters: 817e1b13d456460eb9915890dd578911, a10940ed9d79478faf0dac2c7b960184
W20251028 09:10:28.136850 11295 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
I20251028 09:10:28.136986 11779 raft_consensus.cc:2749] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 6 FOLLOWER]: Leader election lost for term 6. Reason: could not achieve majority
W20251028 09:10:28.137192 11294 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.142359 11034 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.142500 11171 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.142665 11172 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.147567 11034 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.150475 11172 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.150687 11034 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.152148 11172 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.157236 11172 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.159804 11294 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.161878 11298 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.162889 11296 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.167927 11034 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.171833 11034 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.172008 11036 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.175571 11172 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.180795 11171 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.180796 11172 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.186539 11298 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.189496 11295 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.193568 11298 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.200562 11036 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.201308 11036 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.203527 11036 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.212155 11172 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.214126 11172 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.216231 11172 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.225939 11295 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.227916 11298 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.229096 11295 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.240085 11036 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.240784 11036 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.245929 11036 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.254582 11172 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.256688 11172 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.260757 11172 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.269529 11298 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.272505 11298 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.276759 11296 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.289661 11036 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.292721 11036 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.294103 11036 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.310755 11171 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.310757 11172 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.311753 11172 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.330371 11294 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.331511 11298 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.332455 11295 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.349400 11036 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.353539 11034 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.353539 11036 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.371371 11172 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.373379 11172 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.374480 11172 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.394066 11294 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.394080 11298 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.399180 11297 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.402259 11599 raft_consensus.cc:670] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: failed to trigger leader election: Illegal state: leader elections are disabled
W20251028 09:10:28.417220 11036 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.420192 11036 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.425155 11036 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.430922 11780 raft_consensus.cc:670] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: failed to trigger leader election: Illegal state: leader elections are disabled
W20251028 09:10:28.443810 11172 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.444819 11172 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.451916 11172 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.470477 11297 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.472586 11294 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.479636 11294 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.484004 11779 raft_consensus.cc:670] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: failed to trigger leader election: Illegal state: leader elections are disabled
W20251028 09:10:28.496490 11036 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.501528 11036 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.507550 11036 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.526196 11172 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.528213 11172 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.535363 11172 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.555076 11296 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.556981 11297 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.564100 11294 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.585170 11036 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.588932 11036 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.595007 11036 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.617209 11172 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.619215 11172 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.628376 11172 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.650152 11297 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.652132 11294 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.662325 11297 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.685931 11036 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.686880 11036 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.698493 11036 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.721347 11172 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.723333 11172 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.730667 11172 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.756632 11294 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.759583 11294 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.766769 11294 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.794402 11036 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.797091 11036 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.803407 11036 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.833009 11172 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.833009 11171 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.842743 11171 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.869644 11294 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.870534 11294 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.880810 11294 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.907325 11036 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.909235 11036 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.920626 11036 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.949537 11171 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.950330 11171 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.961666 11171 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.988613 11294 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:28.993726 11294 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:29.004474 11294 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:29.033787 11036 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:29.036859 11036 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:29.045174 11036 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:29.078102 11171 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:29.079056 11171 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:29.089350 11171 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:29.123304 11294 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:29.124330 11294 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:29.135613 11294 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:29.169111 11034 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:29.169111 11036 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:29.183403 11036 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:29.213414 11171 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:29.213414 11172 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:29.229871 11172 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:29.259065 11294 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:29.263087 11294 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
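Editor's note: the repeated tablet_service.cc:731 warnings above (and throughout this stretch of the log) are the test workload's write client landing on follower replicas while leadership moves around during the rolling restart; the UUID in each line identifies which replica rejected the op. A minimal log-analysis sketch for tallying these rejections per replica — the file name "test.log" is illustrative, not part of the test harness:

```python
import re
from collections import Counter

# Matches the repeated tablet_service.cc:731 warnings seen in this log.
NOT_LEADER_RE = re.compile(
    r"failed op from .* Illegal state: replica (\w+) is not leader of this config"
)

def count_not_leader_rejections(path):
    """Return a Counter mapping replica UUID -> number of rejected write ops."""
    counts = Counter()
    with open(path) as f:
        for line in f:
            m = NOT_LEADER_RE.search(line)
            if m:
                counts[m.group(1)] += 1
    return counts

if __name__ == "__main__":
    for uuid, n in count_not_leader_rejections("test.log").most_common():
        print(f"{uuid}: {n} rejected ops")
```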
I20251028 09:10:29.264199 11046 tablet_service.cc:1460] Tablet server a10940ed9d79478faf0dac2c7b960184 set to quiescing
I20251028 09:10:29.264256 11046 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20251028 09:10:29.278311 11294 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
I20251028 09:10:29.282545 11326 tablet_service.cc:1460] Tablet server 3fedf3ecbec146b6a6ba544511a16fa1 set to quiescing
I20251028 09:10:29.282605 11326 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
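Editor's note: the tablet_service.cc:1460/1467 pairs above show the quiescing step the rolling restart relies on: each server is marked quiescing and then reports "0 leaders and 0 scanners", i.e. it has nothing left to drain before being killed. A generic sketch of that "wait until drained" polling pattern; count_leaders_and_scanners is a hypothetical callable standing in for whatever check a harness would use, not a Kudu API:

```python
import time

def wait_until_drained(count_leaders_and_scanners, timeout_s=30.0, poll_s=0.25):
    """Poll until the server reports 0 leaders and 0 scanners, or time out."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        leaders, scanners = count_leaders_and_scanners()
        if leaders == 0 and scanners == 0:
            return True
        time.sleep(poll_s)
    return False

# Example: a server that is already drained, as in the log lines above.
print(wait_until_drained(lambda: (0, 0)))  # True
```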
W20251028 09:10:29.305857 11036 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:29.312901 11036 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:29.325143 11036 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57944: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
I20251028 09:10:29.338526  8282 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6m1lU8/build/release/bin/kudu with pid 10993
I20251028 09:10:29.347087  8282 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6m1lU8/build/release/bin/kudu
/tmp/dist-test-task6m1lU8/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.8.22.130:41397
--local_ip_for_outbound_sockets=127.8.22.130
--tserver_master_addrs=127.8.22.190:37123
--webserver_port=36451
--webserver_interface=127.8.22.130
--builtin_ntp_servers=127.8.22.148:35497
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
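Editor's note: the block above is the full command line the mini-cluster uses to restart tablet server ts-1; the flags before the "tserver run" subcommand are the common minicluster settings, the ones after are instance-specific (bind address, master address, webserver port, NTP server). A minimal sketch, assuming the argv lines are available as a list of strings, that turns them into a {flag: value} dict so two restarts can be diffed; bare flags are recorded as "true" to match how they echo back in the "non-default flags" dump below:

```python
def parse_gflags(argv_lines):
    """Turn '--flag=value' / bare '--flag' lines into a dict."""
    flags = {}
    for tok in argv_lines:
        tok = tok.strip()
        if not tok.startswith("--"):
            continue  # skip the binary path and the 'tserver' / 'run' words
        body = tok[2:]
        if "=" in body:
            name, value = body.split("=", 1)
        else:
            name, value = body, "true"
        flags[name] = value
    return flags

# Example with two flags taken from the command line above:
print(parse_gflags(["--rpc_bind_addresses=127.8.22.130:41397", "--never_fsync"]))
```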
W20251028 09:10:29.355443 11172 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:29.360497 11172 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:29.374836 11172 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57846: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:29.405892 11294 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:29.411036 11294 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:29.423240 11294 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:29.426352 11803 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251028 09:10:29.426542 11803 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251028 09:10:29.426579 11803 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251028 09:10:29.428048 11803 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251028 09:10:29.428114 11803 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.8.22.130
I20251028 09:10:29.429661 11803 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.8.22.148:35497
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.8.22.130:41397
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.8.22.130
--webserver_port=36451
--tserver_master_addrs=127.8.22.190:37123
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.11803
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.8.22.130
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 60f5e5267b92c39485a66121d3ce3cc7ef57b0e0
build type RELEASE
built by None at 28 Oct 2025 08:43:13 UTC on 5fd53c4cbb9d
build id 8755
I20251028 09:10:29.429893 11803 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251028 09:10:29.430114 11803 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251028 09:10:29.432993 11811 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:29.433012 11808 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:10:29.433167 11803 server_base.cc:1047] running on GCE node
W20251028 09:10:29.432994 11809 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:10:29.433413 11803 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251028 09:10:29.433588 11803 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251028 09:10:29.434702 11803 hybrid_clock.cc:648] HybridClock initialized: now 1761642629434697 us; error 28 us; skew 500 ppm
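Editor's note: in the HybridClock line above, "error 28 us" is the initial bound on clock error and "skew 500 ppm" is the assumed maximum drift rate, so between synchronizations the bound grows by roughly 500 us per second of wall time. An illustrative back-of-the-envelope calculation based on my reading of those printed fields, not a statement of Kudu internals:

```python
# Illustrative only: how a 28 us initial error grows under a 500 ppm drift
# assumption if the clock went 2 seconds without another sync.
initial_error_us = 28
skew_ppm = 500          # 500 ppm == 500 us of drift per second
elapsed_s = 2.0

worst_case_error_us = initial_error_us + skew_ppm * elapsed_s
print(worst_case_error_us)  # 1028.0 us
```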
I20251028 09:10:29.435846 11803 webserver.cc:492] Webserver started at http://127.8.22.130:36451/ using document root <none> and password file <none>
I20251028 09:10:29.436053 11803 fs_manager.cc:362] Metadata directory not provided
I20251028 09:10:29.436095 11803 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251028 09:10:29.437340 11803 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251028 09:10:29.438067 11817 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:29.438220 11803 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.001s	sys 0.000s
I20251028 09:10:29.438292 11803 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data,/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/wal
uuid: "a10940ed9d79478faf0dac2c7b960184"
format_stamp: "Formatted at 2025-10-28 09:10:15 on dist-test-slave-kqwd"
I20251028 09:10:29.438557 11803 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251028 09:10:29.448628 11803 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251028 09:10:29.448880 11803 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251028 09:10:29.448994 11803 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251028 09:10:29.449209 11803 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251028 09:10:29.449635 11824 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251028 09:10:29.450480 11803 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251028 09:10:29.450527 11803 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251028 09:10:29.450577 11803 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251028 09:10:29.451157 11803 ts_tablet_manager.cc:616] Registered 1 tablets
I20251028 09:10:29.451191 11803 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.001s	sys 0.000s
I20251028 09:10:29.451269 11824 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Bootstrap starting.
W20251028 09:10:29.455818 10072 meta_cache.cc:302] tablet cd9b9f67e7d142db81a1c8be59070ef2: replica a10940ed9d79478faf0dac2c7b960184 (127.8.22.130:41397) has failed: Network error: Client connection negotiation failed: client connection to 127.8.22.130:41397: connect: Connection refused (error 111)
I20251028 09:10:29.459534 11803 rpc_server.cc:307] RPC server started. Bound to: 127.8.22.130:41397
I20251028 09:10:29.459621 11932 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.8.22.130:41397 every 8 connection(s)
I20251028 09:10:29.459910 11803 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data/info.pb
I20251028 09:10:29.461663  8282 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6m1lU8/build/release/bin/kudu as pid 11803
I20251028 09:10:29.461755  8282 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6m1lU8/build/release/bin/kudu with pid 11128
I20251028 09:10:29.468122 11933 heartbeater.cc:344] Connected to a master server at 127.8.22.190:37123
I20251028 09:10:29.468230 11933 heartbeater.cc:461] Registering TS with master...
I20251028 09:10:29.468467 11933 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:29.469085  9476 ts_manager.cc:194] Re-registered known tserver with Master: a10940ed9d79478faf0dac2c7b960184 (127.8.22.130:41397)
I20251028 09:10:29.469650  9476 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.8.22.130:46621
I20251028 09:10:29.472398  8282 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6m1lU8/build/release/bin/kudu
/tmp/dist-test-task6m1lU8/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.8.22.129:39271
--local_ip_for_outbound_sockets=127.8.22.129
--tserver_master_addrs=127.8.22.190:37123
--webserver_port=38771
--webserver_interface=127.8.22.129
--builtin_ntp_servers=127.8.22.148:35497
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20251028 09:10:29.503877 11824 log.cc:826] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Log is configured to *not* fsync() on all Append() calls
W20251028 09:10:29.530668 11294 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:29.553823 11938 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251028 09:10:29.554008 11938 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251028 09:10:29.554044 11938 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251028 09:10:29.555536 11938 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251028 09:10:29.555596 11938 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.8.22.129
I20251028 09:10:29.557154 11938 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.8.22.148:35497
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.8.22.129:39271
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.8.22.129
--webserver_port=38771
--tserver_master_addrs=127.8.22.190:37123
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.11938
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.8.22.129
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 60f5e5267b92c39485a66121d3ce3cc7ef57b0e0
build type RELEASE
built by None at 28 Oct 2025 08:43:13 UTC on 5fd53c4cbb9d
build id 8755
I20251028 09:10:29.557380 11938 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251028 09:10:29.557602 11938 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251028 09:10:29.558766 11294 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:29.560251 11947 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:29.560325 11945 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:29.560254 11944 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:10:29.560652 11938 server_base.cc:1047] running on GCE node
I20251028 09:10:29.560837 11938 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251028 09:10:29.561060 11938 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251028 09:10:29.562194 11938 hybrid_clock.cc:648] HybridClock initialized: now 1761642629562185 us; error 30 us; skew 500 ppm
I20251028 09:10:29.563373 11938 webserver.cc:492] Webserver started at http://127.8.22.129:38771/ using document root <none> and password file <none>
I20251028 09:10:29.563582 11938 fs_manager.cc:362] Metadata directory not provided
I20251028 09:10:29.563629 11938 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251028 09:10:29.564898 11938 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.000s	sys 0.001s
I20251028 09:10:29.565619 11953 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:29.565817 11938 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.000s	sys 0.001s
I20251028 09:10:29.565901 11938 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data,/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/wal
uuid: "817e1b13d456460eb9915890dd578911"
format_stamp: "Formatted at 2025-10-28 09:10:15 on dist-test-slave-kqwd"
I20251028 09:10:29.566171 11938 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
W20251028 09:10:29.577106 11294 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40538: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
I20251028 09:10:29.587581 11938 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251028 09:10:29.587836 11938 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251028 09:10:29.587981 11938 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251028 09:10:29.588196 11938 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251028 09:10:29.588688 11960 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251028 09:10:29.589835 11938 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251028 09:10:29.589887 11938 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251028 09:10:29.589928 11938 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251028 09:10:29.590449 11938 ts_tablet_manager.cc:616] Registered 1 tablets
I20251028 09:10:29.590480 11938 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20251028 09:10:29.590566 11960 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Bootstrap starting.
I20251028 09:10:29.597779 11938 rpc_server.cc:307] RPC server started. Bound to: 127.8.22.129:39271
I20251028 09:10:29.598110 11938 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data/info.pb
I20251028 09:10:29.602432 12068 heartbeater.cc:344] Connected to a master server at 127.8.22.190:37123
I20251028 09:10:29.602525 12068 heartbeater.cc:461] Registering TS with master...
I20251028 09:10:29.602705 12068 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:29.603210  9476 ts_manager.cc:194] Re-registered known tserver with Master: 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271)
I20251028 09:10:29.603586  9476 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.8.22.129:44481
I20251028 09:10:29.607183 12067 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.8.22.129:39271 every 8 connection(s)
I20251028 09:10:29.607774  8282 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6m1lU8/build/release/bin/kudu as pid 11938
I20251028 09:10:29.607867  8282 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6m1lU8/build/release/bin/kudu with pid 11261
I20251028 09:10:29.618463  8282 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6m1lU8/build/release/bin/kudu
/tmp/dist-test-task6m1lU8/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.8.22.132:43959
--local_ip_for_outbound_sockets=127.8.22.132
--tserver_master_addrs=127.8.22.190:37123
--webserver_port=36911
--webserver_interface=127.8.22.132
--builtin_ntp_servers=127.8.22.148:35497
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20251028 09:10:29.659497 11960 log.cc:826] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Log is configured to *not* fsync() on all Append() calls
W20251028 09:10:29.703629 12072 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251028 09:10:29.703804 12072 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251028 09:10:29.703836 12072 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251028 09:10:29.705329 12072 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251028 09:10:29.705397 12072 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.8.22.132
I20251028 09:10:29.706934 12072 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.8.22.148:35497
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.8.22.132:43959
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.8.22.132
--webserver_port=36911
--tserver_master_addrs=127.8.22.190:37123
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.12072
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.8.22.132
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 60f5e5267b92c39485a66121d3ce3cc7ef57b0e0
build type RELEASE
built by None at 28 Oct 2025 08:43:13 UTC on 5fd53c4cbb9d
build id 8755
I20251028 09:10:29.707201 12072 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251028 09:10:29.707433 12072 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251028 09:10:29.709832 12078 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:29.709880 12079 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:29.710444 12081 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:10:29.710541 12072 server_base.cc:1047] running on GCE node
I20251028 09:10:29.710783 12072 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251028 09:10:29.711036 12072 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251028 09:10:29.712203 12072 hybrid_clock.cc:648] HybridClock initialized: now 1761642629712183 us; error 24 us; skew 500 ppm
I20251028 09:10:29.713362 12072 webserver.cc:492] Webserver started at http://127.8.22.132:36911/ using document root <none> and password file <none>
I20251028 09:10:29.713553 12072 fs_manager.cc:362] Metadata directory not provided
I20251028 09:10:29.713599 12072 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251028 09:10:29.714972 12072 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251028 09:10:29.715629 12087 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:29.715807 12072 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.000s	sys 0.001s
I20251028 09:10:29.715879 12072 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data,/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/wal
uuid: "3fedf3ecbec146b6a6ba544511a16fa1"
format_stamp: "Formatted at 2025-10-28 09:10:15 on dist-test-slave-kqwd"
I20251028 09:10:29.716132 12072 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251028 09:10:29.735726 12072 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251028 09:10:29.735982 12072 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251028 09:10:29.736104 12072 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251028 09:10:29.736482 12072 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251028 09:10:29.736984 12094 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251028 09:10:29.737838 12072 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251028 09:10:29.737887 12072 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251028 09:10:29.737926 12072 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251028 09:10:29.738437 12072 ts_tablet_manager.cc:616] Registered 1 tablets
I20251028 09:10:29.738480 12072 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20251028 09:10:29.738617 12094 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Bootstrap starting.
I20251028 09:10:29.745370 12072 rpc_server.cc:307] RPC server started. Bound to: 127.8.22.132:43959
I20251028 09:10:29.745782 12072 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data/info.pb
I20251028 09:10:29.746860 12201 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.8.22.132:43959 every 8 connection(s)
I20251028 09:10:29.755123  8282 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6m1lU8/build/release/bin/kudu as pid 12072
I20251028 09:10:29.755231  8282 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6m1lU8/build/release/bin/kudu with pid 11395
I20251028 09:10:29.762804  8282 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6m1lU8/build/release/bin/kudu
/tmp/dist-test-task6m1lU8/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.8.22.131:39559
--local_ip_for_outbound_sockets=127.8.22.131
--tserver_master_addrs=127.8.22.190:37123
--webserver_port=36027
--webserver_interface=127.8.22.131
--builtin_ntp_servers=127.8.22.148:35497
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20251028 09:10:29.765760 12202 heartbeater.cc:344] Connected to a master server at 127.8.22.190:37123
I20251028 09:10:29.765890 12202 heartbeater.cc:461] Registering TS with master...
I20251028 09:10:29.766137 12202 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:29.766672  9476 ts_manager.cc:194] Re-registered known tserver with Master: 3fedf3ecbec146b6a6ba544511a16fa1 (127.8.22.132:43959)
I20251028 09:10:29.767163  9476 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.8.22.132:36827
I20251028 09:10:29.811224 12094 log.cc:826] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Log is configured to *not* fsync() on all Append() calls
W20251028 09:10:29.907891 12205 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251028 09:10:29.908202 12205 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251028 09:10:29.908300 12205 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251028 09:10:29.910907 12205 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251028 09:10:29.911161 12205 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.8.22.131
I20251028 09:10:29.913698 12205 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.8.22.148:35497
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.8.22.131:39559
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.8.22.131
--webserver_port=36027
--tserver_master_addrs=127.8.22.190:37123
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.12205
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.8.22.131
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 60f5e5267b92c39485a66121d3ce3cc7ef57b0e0
build type RELEASE
built by None at 28 Oct 2025 08:43:13 UTC on 5fd53c4cbb9d
build id 8755
I20251028 09:10:29.914144 12205 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251028 09:10:29.914461 12205 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251028 09:10:29.917708 12213 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:29.917737 12215 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:29.918088 12212 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:10:29.918931 12205 server_base.cc:1047] running on GCE node
I20251028 09:10:29.919147 12205 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251028 09:10:29.919379 12205 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251028 09:10:29.920537 12205 hybrid_clock.cc:648] HybridClock initialized: now 1761642629920522 us; error 28 us; skew 500 ppm
I20251028 09:10:29.921916 12205 webserver.cc:492] Webserver started at http://127.8.22.131:36027/ using document root <none> and password file <none>
I20251028 09:10:29.922143 12205 fs_manager.cc:362] Metadata directory not provided
I20251028 09:10:29.922191 12205 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251028 09:10:29.923885 12205 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.000s	sys 0.001s
I20251028 09:10:29.924613 12221 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:29.924808 12205 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.000s	sys 0.000s
I20251028 09:10:29.924875 12205 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data,/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/wal
uuid: "bdeedf35d61a49bda10c70688541a0ea"
format_stamp: "Formatted at 2025-10-28 09:10:15 on dist-test-slave-kqwd"
I20251028 09:10:29.925190 12205 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251028 09:10:29.940654 12205 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251028 09:10:29.941033 12205 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251028 09:10:29.941363 12205 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251028 09:10:29.941622 12205 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251028 09:10:29.942138 12205 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20251028 09:10:29.942224 12205 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:29.942299 12205 ts_tablet_manager.cc:616] Registered 0 tablets
I20251028 09:10:29.942487 12205 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:29.949430 12205 rpc_server.cc:307] RPC server started. Bound to: 127.8.22.131:39559
I20251028 09:10:29.949952 12205 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data/info.pb
I20251028 09:10:29.950757 12334 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.8.22.131:39559 every 8 connection(s)
I20251028 09:10:29.957731 12335 heartbeater.cc:344] Connected to a master server at 127.8.22.190:37123
I20251028 09:10:29.957902 12335 heartbeater.cc:461] Registering TS with master...
I20251028 09:10:29.958174 12335 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:29.958609  9476 ts_manager.cc:194] Re-registered known tserver with Master: bdeedf35d61a49bda10c70688541a0ea (127.8.22.131:39559)
I20251028 09:10:29.959143  8282 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6m1lU8/build/release/bin/kudu as pid 12205
I20251028 09:10:29.961285  9476 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.8.22.131:60357
I20251028 09:10:30.132807 12136 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:30.139079 11984 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:30.153335 11866 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:30.161005 12260 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:30.470595 11933 heartbeater.cc:499] Master 127.8.22.190:37123 was elected leader, sending a full tablet report...
I20251028 09:10:30.604303 12068 heartbeater.cc:499] Master 127.8.22.190:37123 was elected leader, sending a full tablet report...
I20251028 09:10:30.726230 11824 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Bootstrap replayed 1/2 log segments. Stats: ops{read=4623 overwritten=0 applied=4621 ignored=0} inserts{seen=230900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20251028 09:10:30.768011 12202 heartbeater.cc:499] Master 127.8.22.190:37123 was elected leader, sending a full tablet report...
I20251028 09:10:30.773844 12094 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Bootstrap replayed 1/2 log segments. Stats: ops{read=4771 overwritten=0 applied=4771 ignored=0} inserts{seen=238400 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20251028 09:10:30.955837 11960 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Bootstrap replayed 1/2 log segments. Stats: ops{read=4797 overwritten=0 applied=4794 ignored=0} inserts{seen=239550 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20251028 09:10:30.962240 12335 heartbeater.cc:499] Master 127.8.22.190:37123 was elected leader, sending a full tablet report...
I20251028 09:10:31.175048 12094 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Bootstrap replayed 2/2 log segments. Stats: ops{read=6964 overwritten=0 applied=6964 ignored=0} inserts{seen=348050 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20251028 09:10:31.175487 12094 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Bootstrap complete.
I20251028 09:10:31.178457 12094 ts_tablet_manager.cc:1403] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Time spent bootstrapping tablet: real 1.440s	user 1.199s	sys 0.192s
I20251028 09:10:31.179593 12094 raft_consensus.cc:359] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 6 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:31.179792 12094 raft_consensus.cc:740] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 6 FOLLOWER]: Becoming Follower/Learner. State: Replica: 3fedf3ecbec146b6a6ba544511a16fa1, State: Initialized, Role: FOLLOWER
I20251028 09:10:31.179894 12094 consensus_queue.cc:260] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 6964, Last appended: 5.6964, Last appended by leader: 6964, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:31.180158 12094 ts_tablet_manager.cc:1434] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Time spent starting tablet: real 0.002s	user 0.005s	sys 0.000s
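[editor's note] The bootstrap lines above report ops{read/overwritten/applied/ignored} plus "Pending: N replicates". The numbers (4623 read, 4621 applied, 2 pending; 4797 read, 4794 applied, 3 pending) are consistent with pending being the ops read from the WAL but not yet applied; that relation is inferred from the figures, not confirmed from Kudu source. A minimal sketch under that assumption:

```cpp
// Toy model of the bootstrap stats format printed above (not Kudu source).
#include <cstdint>
#include <iostream>

struct BootstrapOpsStats {
  int64_t read;
  int64_t overwritten;
  int64_t applied;
  int64_t ignored;
  // Assumed relation: pending replicates = ops read but not overwritten/applied/ignored.
  int64_t Pending() const { return read - overwritten - applied - ignored; }
};

int main() {
  BootstrapOpsStats a{4623, 0, 4621, 0};  // a10940ed... after 1/2 segments
  BootstrapOpsStats b{4797, 0, 4794, 0};  // 817e1b13... after 1/2 segments
  std::cout << a.Pending() << " and " << b.Pending() << " pending replicates\n";  // 2 and 3
}
```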
W20251028 09:10:31.221213 10102 scanner-internal.cc:458] Time spent opening tablet: real 2.406s	user 0.001s	sys 0.000s
W20251028 09:10:31.237844 10100 scanner-internal.cc:458] Time spent opening tablet: real 2.406s	user 0.001s	sys 0.001s
W20251028 09:10:31.239465 12113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:31.240082 12113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:31.244819 12114 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
I20251028 09:10:31.399561 11824 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Bootstrap replayed 2/2 log segments. Stats: ops{read=6967 overwritten=0 applied=6967 ignored=0} inserts{seen=348200 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20251028 09:10:31.400161 11824 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Bootstrap complete.
I20251028 09:10:31.404466 11824 ts_tablet_manager.cc:1403] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Time spent bootstrapping tablet: real 1.953s	user 1.663s	sys 0.255s
I20251028 09:10:31.406152 11824 raft_consensus.cc:359] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 6 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:31.406517 11824 raft_consensus.cc:740] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 6 FOLLOWER]: Becoming Follower/Learner. State: Replica: a10940ed9d79478faf0dac2c7b960184, State: Initialized, Role: FOLLOWER
I20251028 09:10:31.406682 11824 consensus_queue.cc:260] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 6967, Last appended: 5.6967, Last appended by leader: 6967, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:31.407025 11824 ts_tablet_manager.cc:1434] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Time spent starting tablet: real 0.002s	user 0.002s	sys 0.000s
I20251028 09:10:31.474589 12373 raft_consensus.cc:493] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 6 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251028 09:10:31.474701 12373 raft_consensus.cc:515] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 6 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:31.475003 12373 leader_election.cc:290] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [CANDIDATE]: Term 7 pre-election: Requested pre-vote from peers 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271), a10940ed9d79478faf0dac2c7b960184 (127.8.22.130:41397)
W20251028 09:10:31.475595 12116 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:31.481436 12116 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
I20251028 09:10:31.482939 12022 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" candidate_term: 7 candidate_status { last_received { term: 5 index: 6964 } } ignore_live_leader: false dest_uuid: "817e1b13d456460eb9915890dd578911" is_pre_election: true
W20251028 09:10:31.484629 12088 leader_election.cc:343] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [CANDIDATE]: Term 7 pre-election: Tablet error from VoteRequest() call to peer 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271): Illegal state: must be running to vote when last-logged opid is not known
W20251028 09:10:31.484998 12116 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
I20251028 09:10:31.487994 11887 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" candidate_term: 7 candidate_status { last_received { term: 5 index: 6964 } } ignore_live_leader: false dest_uuid: "a10940ed9d79478faf0dac2c7b960184" is_pre_election: true
I20251028 09:10:31.488127 11887 raft_consensus.cc:2410] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 6 FOLLOWER]: Leader pre-election vote request: Denying vote to candidate 3fedf3ecbec146b6a6ba544511a16fa1 for term 7 because replica has last-logged OpId of term: 5 index: 6967, which is greater than that of the candidate, which has last-logged OpId of term: 5 index: 6964.
I20251028 09:10:31.488453 12089 leader_election.cc:304] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [CANDIDATE]: Term 7 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 3fedf3ecbec146b6a6ba544511a16fa1; no voters: 817e1b13d456460eb9915890dd578911, a10940ed9d79478faf0dac2c7b960184
I20251028 09:10:31.488617 12373 raft_consensus.cc:2749] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 6 FOLLOWER]: Leader pre-election lost for term 7. Reason: could not achieve majority
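[editor's note] The pre-election above fails because the voters deny the candidate on log freshness: the candidate's last-logged OpId is 5.6964 while the voters have 5.6967. A minimal sketch of that comparison rule (standard Raft "at least as up-to-date" check; not Kudu source):

```cpp
// Vote is granted only if the candidate's last (term, index) is >= the voter's.
#include <iostream>
#include <utility>

using OpId = std::pair<long, long>;  // (term, index), compared lexicographically

bool ShouldGrantVote(const OpId& candidate_last, const OpId& voter_last) {
  return candidate_last >= voter_last;
}

int main() {
  OpId candidate{5, 6964};  // 3fedf3ec... last-logged OpId
  OpId voter{5, 6967};      // a10940ed... / 817e1b13... last-logged OpId
  std::cout << std::boolalpha << ShouldGrantVote(candidate, voter) << "\n";  // false -> "no" vote
}
```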
W20251028 09:10:31.556249 11846 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:31.563706 11846 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
I20251028 09:10:31.609848 11960 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Bootstrap replayed 2/2 log segments. Stats: ops{read=6967 overwritten=0 applied=6967 ignored=0} inserts{seen=348200 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20251028 09:10:31.610338 11960 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Bootstrap complete.
I20251028 09:10:31.614605 11960 ts_tablet_manager.cc:1403] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Time spent bootstrapping tablet: real 2.024s	user 1.731s	sys 0.254s
I20251028 09:10:31.615836 11960 raft_consensus.cc:359] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 6 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:31.616146 11960 raft_consensus.cc:740] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 6 FOLLOWER]: Becoming Follower/Learner. State: Replica: 817e1b13d456460eb9915890dd578911, State: Initialized, Role: FOLLOWER
I20251028 09:10:31.616266 11960 consensus_queue.cc:260] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 6967, Last appended: 5.6967, Last appended by leader: 6967, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:31.616503 11960 ts_tablet_manager.cc:1434] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Time spent starting tablet: real 0.002s	user 0.002s	sys 0.000s
W20251028 09:10:31.640074 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:31.645539 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:31.651650 11846 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
I20251028 09:10:31.684020 12375 raft_consensus.cc:493] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 6 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251028 09:10:31.684195 12375 raft_consensus.cc:515] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 6 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:31.684540 12375 leader_election.cc:290] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [CANDIDATE]: Term 7 pre-election: Requested pre-vote from peers 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271), 3fedf3ecbec146b6a6ba544511a16fa1 (127.8.22.132:43959)
I20251028 09:10:31.688474 12022 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "a10940ed9d79478faf0dac2c7b960184" candidate_term: 7 candidate_status { last_received { term: 5 index: 6967 } } ignore_live_leader: false dest_uuid: "817e1b13d456460eb9915890dd578911" is_pre_election: true
I20251028 09:10:31.688474 12156 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "a10940ed9d79478faf0dac2c7b960184" candidate_term: 7 candidate_status { last_received { term: 5 index: 6967 } } ignore_live_leader: false dest_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" is_pre_election: true
I20251028 09:10:31.688583 12156 raft_consensus.cc:2468] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 6 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate a10940ed9d79478faf0dac2c7b960184 in term 6.
I20251028 09:10:31.688612 12022 raft_consensus.cc:2468] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 6 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate a10940ed9d79478faf0dac2c7b960184 in term 6.
I20251028 09:10:31.688766 11821 leader_election.cc:304] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [CANDIDATE]: Term 7 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 3fedf3ecbec146b6a6ba544511a16fa1, a10940ed9d79478faf0dac2c7b960184; no voters: 
I20251028 09:10:31.688915 12375 raft_consensus.cc:2804] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 6 FOLLOWER]: Leader pre-election won for term 7
I20251028 09:10:31.688972 12375 raft_consensus.cc:493] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 6 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20251028 09:10:31.689004 12375 raft_consensus.cc:3060] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 6 FOLLOWER]: Advancing to term 7
I20251028 09:10:31.690577 12375 raft_consensus.cc:515] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 7 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:31.690727 12375 leader_election.cc:290] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [CANDIDATE]: Term 7 election: Requested vote from peers 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271), 3fedf3ecbec146b6a6ba544511a16fa1 (127.8.22.132:43959)
I20251028 09:10:31.690843 12156 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "a10940ed9d79478faf0dac2c7b960184" candidate_term: 7 candidate_status { last_received { term: 5 index: 6967 } } ignore_live_leader: false dest_uuid: "3fedf3ecbec146b6a6ba544511a16fa1"
I20251028 09:10:31.690933 12156 raft_consensus.cc:3060] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 6 FOLLOWER]: Advancing to term 7
I20251028 09:10:31.691006 12022 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "a10940ed9d79478faf0dac2c7b960184" candidate_term: 7 candidate_status { last_received { term: 5 index: 6967 } } ignore_live_leader: false dest_uuid: "817e1b13d456460eb9915890dd578911"
I20251028 09:10:31.691083 12022 raft_consensus.cc:3060] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 6 FOLLOWER]: Advancing to term 7
I20251028 09:10:31.692135 12156 raft_consensus.cc:2468] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 7 FOLLOWER]: Leader election vote request: Granting yes vote for candidate a10940ed9d79478faf0dac2c7b960184 in term 7.
I20251028 09:10:31.692277 11821 leader_election.cc:304] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [CANDIDATE]: Term 7 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 3fedf3ecbec146b6a6ba544511a16fa1, a10940ed9d79478faf0dac2c7b960184; no voters: 
I20251028 09:10:31.692355 12375 raft_consensus.cc:2804] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 7 FOLLOWER]: Leader election won for term 7
I20251028 09:10:31.692464 12022 raft_consensus.cc:2468] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 7 FOLLOWER]: Leader election vote request: Granting yes vote for candidate a10940ed9d79478faf0dac2c7b960184 in term 7.
I20251028 09:10:31.692497 12375 raft_consensus.cc:697] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 7 LEADER]: Becoming Leader. State: Replica: a10940ed9d79478faf0dac2c7b960184, State: Running, Role: LEADER
I20251028 09:10:31.692595 12375 consensus_queue.cc:237] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 6967, Committed index: 6967, Last appended: 5.6967, Last appended by leader: 6967, Current term: 7, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:31.693367  9476 catalog_manager.cc:5649] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 reported cstate change: term changed from 5 to 7. New cstate: current_term: 7 leader_uuid: "a10940ed9d79478faf0dac2c7b960184" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } health_report { overall_health: UNKNOWN } } }
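[editor's note] The election summaries above show the quorum arithmetic: with 3 voters, 2 yes votes (the candidate's own plus one peer) are enough to win, while the earlier pre-election with only 1 yes vote lost. A minimal sketch of that majority check (not Kudu source):

```cpp
// A candidate wins once yes votes (including its own) reach floor(voters/2) + 1.
#include <iostream>

bool ElectionWon(int yes_votes, int total_voters) {
  const int majority = total_voters / 2 + 1;
  return yes_votes >= majority;
}

int main() {
  std::cout << std::boolalpha
            << ElectionWon(2, 3) << "\n"   // term 7 election above: won
            << ElectionWon(1, 3) << "\n";  // term 7 pre-election by 3fedf3ec...: lost
}
```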
W20251028 09:10:31.725304 12114 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:31.727429 12114 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:31.736210 12116 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
I20251028 09:10:31.795159 12156 raft_consensus.cc:1275] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 7 FOLLOWER]: Refusing update from remote peer a10940ed9d79478faf0dac2c7b960184: Log matching property violated. Preceding OpId in replica: term: 5 index: 6964. Preceding OpId from leader: term: 7 index: 6968. (index mismatch)
I20251028 09:10:31.795465 12375 consensus_queue.cc:1048] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [LEADER]: Connected to new peer: Peer: permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 6968, Last known committed idx: 6964, Time since last communication: 0.000s
I20251028 09:10:31.797875 12022 raft_consensus.cc:1275] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 7 FOLLOWER]: Refusing update from remote peer a10940ed9d79478faf0dac2c7b960184: Log matching property violated. Preceding OpId in replica: term: 5 index: 6967. Preceding OpId from leader: term: 7 index: 6968. (index mismatch)
I20251028 09:10:31.798357 12384 consensus_queue.cc:1048] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [LEADER]: Connected to new peer: Peer: permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 6968, Last known committed idx: 6967, Time since last communication: 0.000s
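[editor's note] The "Refusing update ... (index mismatch)" / LMP_MISMATCH exchange above is the log-matching check: a follower accepts a batch only if the leader's stated preceding OpId matches what the follower actually has, otherwise the leader walks the peer's next index back until the logs line up. A minimal sketch of that acceptance test (not Kudu source):

```cpp
// Follower-side check behind the LMP_MISMATCH rejections above.
#include <iostream>
#include <utility>

using OpId = std::pair<long, long>;  // (term, index)

bool AcceptUpdate(const OpId& follower_last, const OpId& leader_preceding) {
  return follower_last == leader_preceding;
}

int main() {
  OpId follower_last{5, 6964};     // 3fedf3ec... right after bootstrap
  OpId leader_preceding{7, 6968};  // preceding OpId claimed by the new term-7 leader
  std::cout << std::boolalpha
            << AcceptUpdate(follower_last, leader_preceding) << "\n";  // false -> LMP_MISMATCH
}
```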
W20251028 09:10:31.819686 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:31.821116 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:31.821430 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:32.622043 10101 scanner-internal.cc:458] Time spent opening tablet: real 4.022s	user 0.001s	sys 0.001s
I20251028 09:10:35.428300 12136 tablet_service.cc:1467] Tablet server has 0 leaders and 3 scanners
I20251028 09:10:35.432366 11866 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
I20251028 09:10:35.437296 12260 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:35.447978 11984 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:35.781044  9475 ts_manager.cc:284] Unset tserver state for 3fedf3ecbec146b6a6ba544511a16fa1 from MAINTENANCE_MODE
I20251028 09:10:35.802197 12202 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:35.802345 12068 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:35.816184 11933 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:35.838898  9475 ts_manager.cc:284] Unset tserver state for a10940ed9d79478faf0dac2c7b960184 from MAINTENANCE_MODE
I20251028 09:10:35.886627  9475 ts_manager.cc:284] Unset tserver state for 817e1b13d456460eb9915890dd578911 from MAINTENANCE_MODE
I20251028 09:10:35.917634  9475 ts_manager.cc:284] Unset tserver state for bdeedf35d61a49bda10c70688541a0ea from MAINTENANCE_MODE
I20251028 09:10:35.966219 12335 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:36.318707  9475 ts_manager.cc:295] Set tserver state for a10940ed9d79478faf0dac2c7b960184 to MAINTENANCE_MODE
I20251028 09:10:36.341055  9475 ts_manager.cc:295] Set tserver state for bdeedf35d61a49bda10c70688541a0ea to MAINTENANCE_MODE
I20251028 09:10:36.373068  9475 ts_manager.cc:295] Set tserver state for 3fedf3ecbec146b6a6ba544511a16fa1 to MAINTENANCE_MODE
I20251028 09:10:36.398917  9475 ts_manager.cc:295] Set tserver state for 817e1b13d456460eb9915890dd578911 to MAINTENANCE_MODE
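[editor's note] The ts_manager lines above show the master unsetting and then re-setting MAINTENANCE_MODE per tablet-server UUID. A minimal toy model of that per-UUID state bookkeeping (not Kudu source; the class and method names are hypothetical):

```cpp
// Toy model: each tserver UUID maps to an optional state such as MAINTENANCE_MODE.
#include <iostream>
#include <string>
#include <unordered_map>

enum class TServerState { kNone, kMaintenanceMode };

class TSStateMap {
 public:
  void Set(const std::string& uuid, TServerState s) { states_[uuid] = s; }
  void Unset(const std::string& uuid) { states_[uuid] = TServerState::kNone; }
  bool InMaintenance(const std::string& uuid) const {
    auto it = states_.find(uuid);
    return it != states_.end() && it->second == TServerState::kMaintenanceMode;
  }
 private:
  std::unordered_map<std::string, TServerState> states_;
};

int main() {
  TSStateMap m;
  m.Set("a10940ed9d79478faf0dac2c7b960184", TServerState::kMaintenanceMode);
  std::cout << std::boolalpha << m.InMaintenance("a10940ed9d79478faf0dac2c7b960184") << "\n";  // true
  m.Unset("a10940ed9d79478faf0dac2c7b960184");
  std::cout << m.InMaintenance("a10940ed9d79478faf0dac2c7b960184") << "\n";  // false
}
```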
I20251028 09:10:36.695885 11984 tablet_service.cc:1460] Tablet server 817e1b13d456460eb9915890dd578911 set to quiescing
I20251028 09:10:36.695955 11984 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:36.697168 12260 tablet_service.cc:1460] Tablet server bdeedf35d61a49bda10c70688541a0ea set to quiescing
I20251028 09:10:36.697230 12260 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:36.720897 11866 tablet_service.cc:1460] Tablet server a10940ed9d79478faf0dac2c7b960184 set to quiescing
I20251028 09:10:36.720978 11866 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
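[editor's note] Right after a10940ed... is set to quiescing while still hosting 1 leader, it instructs a follower to start an election. The inferred behavior (an assumption drawn from these adjacent lines, not confirmed from Kudu source) is that a quiescing server holding leaderships tries to shed them. A minimal sketch of that decision:

```cpp
// Assumed rule: quiescing + still leading at least one tablet -> transfer leadership.
#include <iostream>

struct TabletServer {
  bool quiescing;
  int num_leaders;
};

bool ShouldTransferLeadership(const TabletServer& ts) {
  return ts.quiescing && ts.num_leaders > 0;
}

int main() {
  TabletServer ts{/*quiescing=*/true, /*num_leaders=*/1};  // a10940ed... above
  std::cout << std::boolalpha << ShouldTransferLeadership(ts) << "\n";  // true
}
```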
I20251028 09:10:36.723443 12398 raft_consensus.cc:993] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: : Instructing follower 3fedf3ecbec146b6a6ba544511a16fa1 to start an election
I20251028 09:10:36.725221 12398 raft_consensus.cc:1081] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 7 LEADER]: Signalling peer 3fedf3ecbec146b6a6ba544511a16fa1 to start an election
I20251028 09:10:36.726392 12156 tablet_service.cc:2038] Received Run Leader Election RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2"
dest_uuid: "3fedf3ecbec146b6a6ba544511a16fa1"
 from {username='slave'} at 127.8.22.130:39505
I20251028 09:10:36.726496 12156 raft_consensus.cc:493] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 7 FOLLOWER]: Starting forced leader election (received explicit request)
I20251028 09:10:36.726528 12156 raft_consensus.cc:3060] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 7 FOLLOWER]: Advancing to term 8
I20251028 09:10:36.727084 12398 raft_consensus.cc:993] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: : Instructing follower 3fedf3ecbec146b6a6ba544511a16fa1 to start an election
I20251028 09:10:36.727154 12398 raft_consensus.cc:1081] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 7 LEADER]: Signalling peer 3fedf3ecbec146b6a6ba544511a16fa1 to start an election
I20251028 09:10:36.727414 12155 tablet_service.cc:2038] Received Run Leader Election RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2"
dest_uuid: "3fedf3ecbec146b6a6ba544511a16fa1"
 from {username='slave'} at 127.8.22.130:39505
I20251028 09:10:36.728559 12156 raft_consensus.cc:515] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 8 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:36.728896 12155 raft_consensus.cc:493] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 8 FOLLOWER]: Starting forced leader election (received explicit request)
I20251028 09:10:36.728945 12155 raft_consensus.cc:3060] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 8 FOLLOWER]: Advancing to term 9
I20251028 09:10:36.729789 12155 raft_consensus.cc:515] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 9 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:36.730173 12155 leader_election.cc:290] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [CANDIDATE]: Term 9 election: Requested vote from peers 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271), a10940ed9d79478faf0dac2c7b960184 (127.8.22.130:41397)
I20251028 09:10:36.730273 12154 raft_consensus.cc:1240] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 9 FOLLOWER]: Rejecting Update request from peer a10940ed9d79478faf0dac2c7b960184 for earlier term 7. Current term is 9. Ops: [7.11032-7.11032]
I20251028 09:10:36.730595 12021 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" candidate_term: 9 candidate_status { last_received { term: 7 index: 11031 } } ignore_live_leader: true dest_uuid: "817e1b13d456460eb9915890dd578911"
I20251028 09:10:36.730670 12021 raft_consensus.cc:3060] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 7 FOLLOWER]: Advancing to term 9
I20251028 09:10:36.730906 11887 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" candidate_term: 9 candidate_status { last_received { term: 7 index: 11031 } } ignore_live_leader: true dest_uuid: "a10940ed9d79478faf0dac2c7b960184"
I20251028 09:10:36.731017 11887 raft_consensus.cc:3055] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 7 LEADER]: Stepping down as leader of term 7
I20251028 09:10:36.731050 11887 raft_consensus.cc:740] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 7 LEADER]: Becoming Follower/Learner. State: Replica: a10940ed9d79478faf0dac2c7b960184, State: Running, Role: LEADER
I20251028 09:10:36.731101 11887 consensus_queue.cc:260] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 11034, Committed index: 11034, Last appended: 7.11034, Last appended by leader: 11034, Current term: 7, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:36.731189 11887 raft_consensus.cc:3060] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 7 FOLLOWER]: Advancing to term 9
I20251028 09:10:36.731273 12156 leader_election.cc:290] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [CANDIDATE]: Term 8 election: Requested vote from peers 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271), a10940ed9d79478faf0dac2c7b960184 (127.8.22.130:41397)
I20251028 09:10:36.731472 12021 raft_consensus.cc:2410] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 9 FOLLOWER]: Leader election vote request: Denying vote to candidate 3fedf3ecbec146b6a6ba544511a16fa1 for term 9 because replica has last-logged OpId of term: 7 index: 11034, which is greater than that of the candidate, which has last-logged OpId of term: 7 index: 11031.
I20251028 09:10:36.731645 11886 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" candidate_term: 8 candidate_status { last_received { term: 7 index: 11031 } } ignore_live_leader: true dest_uuid: "a10940ed9d79478faf0dac2c7b960184"
I20251028 09:10:36.731948 12021 raft_consensus.cc:1240] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 9 FOLLOWER]: Rejecting Update request from peer a10940ed9d79478faf0dac2c7b960184 for earlier term 7. Current term is 9. Ops: []
I20251028 09:10:36.731999 11887 raft_consensus.cc:2410] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 9 FOLLOWER]: Leader election vote request: Denying vote to candidate 3fedf3ecbec146b6a6ba544511a16fa1 for term 9 because replica has last-logged OpId of term: 7 index: 11034, which is greater than that of the candidate, which has last-logged OpId of term: 7 index: 11031.
I20251028 09:10:36.732816 12088 leader_election.cc:304] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [CANDIDATE]: Term 9 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 3fedf3ecbec146b6a6ba544511a16fa1; no voters: 817e1b13d456460eb9915890dd578911, a10940ed9d79478faf0dac2c7b960184
I20251028 09:10:36.732928 12021 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" candidate_term: 8 candidate_status { last_received { term: 7 index: 11031 } } ignore_live_leader: true dest_uuid: "817e1b13d456460eb9915890dd578911"
I20251028 09:10:36.733011 12021 raft_consensus.cc:2368] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 9 FOLLOWER]: Leader election vote request: Denying vote to candidate 3fedf3ecbec146b6a6ba544511a16fa1 for earlier term 8. Current term is 9.
W20251028 09:10:36.733443 11846 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.733461 11844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
I20251028 09:10:36.733703 12088 leader_election.cc:400] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [CANDIDATE]: Term 8 election: Vote denied by peer 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271) with higher term. Message: Invalid argument: T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 9 FOLLOWER]: Leader election vote request: Denying vote to candidate 3fedf3ecbec146b6a6ba544511a16fa1 for earlier term 8. Current term is 9.
I20251028 09:10:36.733748 12088 leader_election.cc:403] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [CANDIDATE]: Term 8 election: Cancelling election due to peer responding with higher term
W20251028 09:10:36.733863 11844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
I20251028 09:10:36.734148 12579 raft_consensus.cc:2749] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 9 FOLLOWER]: Leader election lost for term 9. Reason: could not achieve majority
I20251028 09:10:36.734241 12579 raft_consensus.cc:2749] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 9 FOLLOWER]: Leader election lost for term 8. Reason: Vote denied by peer 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271) with higher term. Message: Invalid argument: T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 9 FOLLOWER]: Leader election vote request: Denying vote to candidate 3fedf3ecbec146b6a6ba544511a16fa1 for earlier term 8. Current term is 9.
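[editor's note] The follower 3fedf3ec... received two "Run Leader Election" RPCs back to back, bumping its term from 7 to 8 and then 9, and lost both attempts: the term-9 election on log freshness (7.11031 vs. the peers' 7.11034), and the stale term-8 election because a peer had already advanced to term 9 and answered with a higher term, cancelling it. A minimal sketch of the term bumping and cancel-on-higher-term rule (not Kudu source; names are hypothetical):

```cpp
// Each forced election request advances the term; a higher-term response cancels an election.
#include <iostream>

struct Candidate {
  long current_term = 7;
  long StartForcedElection() { return ++current_term; }  // term of the new election
};

bool CancelOnResponse(long election_term, long responder_term) {
  return responder_term > election_term;
}

int main() {
  Candidate c;
  long first = c.StartForcedElection();   // term 8
  long second = c.StartForcedElection();  // term 9
  std::cout << first << " " << second << "\n";
  std::cout << std::boolalpha << CancelOnResponse(first, /*responder_term=*/9) << "\n";  // true
}
```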
W20251028 09:10:36.736344 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.737457 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.738524 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.740135 12114 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.742190 12113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.743237 12113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.746088 11844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.753926 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.756174 11846 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.756635 11844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.759543 12114 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.762162 12113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.763253 12116 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.766372 11844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.770720 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.771632 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.775975 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
I20251028 09:10:36.777558 12136 tablet_service.cc:1460] Tablet server 3fedf3ecbec146b6a6ba544511a16fa1 set to quiescing
I20251028 09:10:36.777617 12136 tablet_service.cc:1467] Tablet server has 0 leaders and 3 scanners
W20251028 09:10:36.781040 11844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.781982 11844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.788893 12113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.793452 12113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.793464 12116 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.800801 11844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.803340 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
I20251028 09:10:36.803581 12068 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:36.804080 12202 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
W20251028 09:10:36.804242 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.814193 11844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.815192 11844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.815299 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
I20251028 09:10:36.817534 11933 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
W20251028 09:10:36.826895 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.828959 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.829167 12114 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.841477 12112 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.843683 12114 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.845173 11844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.858696 11844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.859551 11844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.861104 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.876230 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.876454 11980 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.880033 12113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.894119 12112 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.894119 12116 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.897164 11844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.913317 11844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.914150 11844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.916760 11980 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.932849 11980 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.933029 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.934549 12113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.951745 12116 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.952702 12116 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.955478 11844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.972612 11846 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.972612 11844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.977244 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.992357 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:36.993376 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.001147 12112 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.015133 12113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.017352 12113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.026274 11844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.030118 12590 raft_consensus.cc:670] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: failed to trigger leader election: Illegal state: leader elections are disabled
W20251028 09:10:37.037030 11844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.043223 11844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.052856 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.059898 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.065994 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.069610 12384 raft_consensus.cc:670] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: failed to trigger leader election: Illegal state: leader elections are disabled
W20251028 09:10:37.073303 12578 raft_consensus.cc:670] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: failed to trigger leader election: Illegal state: leader elections are disabled
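[editor's note] From this point the tablet stays leaderless: all three replicas report "failed to trigger leader election: Illegal state: leader elections are disabled" shortly after every tablet server was set to quiescing, and writes keep failing with "not leader of this config" on each replica. The link between quiescing and disabled elections is inferred from the ordering of these lines, not confirmed from Kudu source. A minimal sketch of that inference:

```cpp
// Assumed rule: a quiescing server won't start elections; if every replica's server is
// quiescing, no candidate ever steps up and the tablet has no leader.
#include <algorithm>
#include <iostream>
#include <vector>

struct Replica { bool quiescing; };

bool AnyoneCanStartElection(const std::vector<Replica>& replicas) {
  return std::any_of(replicas.begin(), replicas.end(),
                     [](const Replica& r) { return !r.quiescing; });
}

int main() {
  std::vector<Replica> replicas = {{true}, {true}, {true}};  // all three quiescing
  std::cout << std::boolalpha << AnyoneCanStartElection(replicas) << "\n";  // false
}
```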
W20251028 09:10:37.077661 12114 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.087568 12116 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.092813 12113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.106676 11844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.115630 11844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.120985 11844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.132699 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.142895 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.149915 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.159526 12116 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.172586 12116 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.178752 12113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.190507 11844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.203575 11844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.207703 11844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.223786 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.233835 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.236944 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.255605 12113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.266613 12114 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.268700 12112 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.290676 11844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.300576 11846 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.300602 11844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.324275 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.333427 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.335392 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.359062 12113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.368096 12113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.368260 12112 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.396194 11844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.405208 11844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.405974 11844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.434577 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.440651 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.444749 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.472365 12116 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.476529 12112 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.480616 12116 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.511415 11844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.513360 11844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.517344 11844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.550981 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.554020 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.555081 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.591663 12116 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.592751 12112 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.594853 12113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.634797 11844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.635782 11844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.636672 11844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.678308 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.680370 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.681667 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.720427 12116 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.722402 12114 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.726553 12114 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.766546 11844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.767500 11844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.769299 11844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.811134 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.813144 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.816290 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.858186 12114 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.859133 12114 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.861302 12114 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
I20251028 09:10:37.894343 11866 tablet_service.cc:1460] Tablet server a10940ed9d79478faf0dac2c7b960184 set to quiescing
I20251028 09:10:37.894407 11866 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20251028 09:10:37.904909 11844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.905594 11844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.907971 11844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44272: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
I20251028 09:10:37.941919 12136 tablet_service.cc:1460] Tablet server 3fedf3ecbec146b6a6ba544511a16fa1 set to quiescing
I20251028 09:10:37.941990 12136 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20251028 09:10:37.954704 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.956642 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:37.957777 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
I20251028 09:10:37.997946  8282 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6m1lU8/build/release/bin/kudu with pid 11803
W20251028 09:10:38.004367 12114 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:38.007607 12114 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
I20251028 09:10:38.008427  8282 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6m1lU8/build/release/bin/kudu
/tmp/dist-test-task6m1lU8/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.8.22.130:41397
--local_ip_for_outbound_sockets=127.8.22.130
--tserver_master_addrs=127.8.22.190:37123
--webserver_port=36451
--webserver_interface=127.8.22.130
--builtin_ntp_servers=127.8.22.148:35497
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20251028 09:10:38.008654 12114 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:38.058817 10072 meta_cache.cc:302] tablet cd9b9f67e7d142db81a1c8be59070ef2: replica a10940ed9d79478faf0dac2c7b960184 (127.8.22.130:41397) has failed: Network error: Client connection negotiation failed: client connection to 127.8.22.130:41397: connect: Connection refused (error 111)
W20251028 09:10:38.089658 12613 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251028 09:10:38.089855 12613 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251028 09:10:38.089877 12613 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251028 09:10:38.091341 12613 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251028 09:10:38.091395 12613 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.8.22.130
I20251028 09:10:38.093050 12613 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.8.22.148:35497
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.8.22.130:41397
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.8.22.130
--webserver_port=36451
--tserver_master_addrs=127.8.22.190:37123
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.12613
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.8.22.130
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 60f5e5267b92c39485a66121d3ce3cc7ef57b0e0
build type RELEASE
built by None at 28 Oct 2025 08:43:13 UTC on 5fd53c4cbb9d
build id 8755
I20251028 09:10:38.093290 12613 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251028 09:10:38.093523 12613 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251028 09:10:38.096360 12620 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:38.096493 12619 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:38.096560 12622 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:10:38.096611 12613 server_base.cc:1047] running on GCE node
I20251028 09:10:38.096904 12613 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251028 09:10:38.097131 12613 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251028 09:10:38.098286 12613 hybrid_clock.cc:648] HybridClock initialized: now 1761642638098272 us; error 33 us; skew 500 ppm
I20251028 09:10:38.099648 12613 webserver.cc:492] Webserver started at http://127.8.22.130:36451/ using document root <none> and password file <none>
I20251028 09:10:38.099888 12613 fs_manager.cc:362] Metadata directory not provided
I20251028 09:10:38.099934 12613 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251028 09:10:38.101152 12613 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.000s	sys 0.001s
I20251028 09:10:38.101887 12628 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:38.102093 12613 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.000s	sys 0.001s
I20251028 09:10:38.102169 12613 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data,/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/wal
uuid: "a10940ed9d79478faf0dac2c7b960184"
format_stamp: "Formatted at 2025-10-28 09:10:15 on dist-test-slave-kqwd"
I20251028 09:10:38.102442 12613 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
W20251028 09:10:38.110563 11980 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:38.110563 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
I20251028 09:10:38.110823 12613 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251028 09:10:38.111083 12613 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251028 09:10:38.111196 12613 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251028 09:10:38.111395 12613 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251028 09:10:38.111871 12635 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251028 09:10:38.112748 12613 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251028 09:10:38.112805 12613 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251028 09:10:38.112849 12613 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251028 09:10:38.113459 12613 ts_tablet_manager.cc:616] Registered 1 tablets
I20251028 09:10:38.113535 12613 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
W20251028 09:10:38.113554 11981 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38532: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
I20251028 09:10:38.113556 12635 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Bootstrap starting.
I20251028 09:10:38.119701 12613 rpc_server.cc:307] RPC server started. Bound to: 127.8.22.130:41397
I20251028 09:10:38.119766 12742 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.8.22.130:41397 every 8 connection(s)
I20251028 09:10:38.120083 12613 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data/info.pb
I20251028 09:10:38.123787  8282 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6m1lU8/build/release/bin/kudu as pid 12613
I20251028 09:10:38.123890  8282 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6m1lU8/build/release/bin/kudu with pid 11938
I20251028 09:10:38.131112 12743 heartbeater.cc:344] Connected to a master server at 127.8.22.190:37123
I20251028 09:10:38.131263 12743 heartbeater.cc:461] Registering TS with master...
I20251028 09:10:38.131551 12743 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:38.132194  9478 ts_manager.cc:194] Re-registered known tserver with Master: a10940ed9d79478faf0dac2c7b960184 (127.8.22.130:41397)
I20251028 09:10:38.132809  9478 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.8.22.130:55503
I20251028 09:10:38.134588  8282 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6m1lU8/build/release/bin/kudu
/tmp/dist-test-task6m1lU8/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.8.22.129:39271
--local_ip_for_outbound_sockets=127.8.22.129
--tserver_master_addrs=127.8.22.190:37123
--webserver_port=38771
--webserver_interface=127.8.22.129
--builtin_ntp_servers=127.8.22.148:35497
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20251028 09:10:38.159341 12635 log.cc:826] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Log is configured to *not* fsync() on all Append() calls
W20251028 09:10:38.162968 12114 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:38.164947 12114 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:38.169152 12114 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47914: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:38.219864 12747 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251028 09:10:38.220062 12747 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251028 09:10:38.220095 12747 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251028 09:10:38.221674 12747 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251028 09:10:38.221764 12747 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.8.22.129
I20251028 09:10:38.223693 12747 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.8.22.148:35497
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.8.22.129:39271
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.8.22.129
--webserver_port=38771
--tserver_master_addrs=127.8.22.190:37123
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.12747
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.8.22.129
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 60f5e5267b92c39485a66121d3ce3cc7ef57b0e0
build type RELEASE
built by None at 28 Oct 2025 08:43:13 UTC on 5fd53c4cbb9d
build id 8755
I20251028 09:10:38.223935 12747 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251028 09:10:38.224165 12747 file_cache.cc:492] Constructed file cache file cache with capacity 419430
I20251028 09:10:38.227003 12747 server_base.cc:1047] running on GCE node
W20251028 09:10:38.227110 12757 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:38.226991 12755 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:38.226971 12754 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:10:38.227356 12747 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251028 09:10:38.227566 12747 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251028 09:10:38.228706 12747 hybrid_clock.cc:648] HybridClock initialized: now 1761642638228691 us; error 33 us; skew 500 ppm
I20251028 09:10:38.229851 12747 webserver.cc:492] Webserver started at http://127.8.22.129:38771/ using document root <none> and password file <none>
I20251028 09:10:38.230041 12747 fs_manager.cc:362] Metadata directory not provided
I20251028 09:10:38.230078 12747 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251028 09:10:38.231369 12747 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.000s	sys 0.001s
I20251028 09:10:38.232023 12763 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:38.232219 12747 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.000s	sys 0.001s
I20251028 09:10:38.232312 12747 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data,/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/wal
uuid: "817e1b13d456460eb9915890dd578911"
format_stamp: "Formatted at 2025-10-28 09:10:15 on dist-test-slave-kqwd"
I20251028 09:10:38.232653 12747 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251028 09:10:38.246352 12747 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251028 09:10:38.246667 12747 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251028 09:10:38.246814 12747 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251028 09:10:38.247083 12747 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251028 09:10:38.247601 12770 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251028 09:10:38.248497 12747 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251028 09:10:38.248557 12747 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251028 09:10:38.248595 12747 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251028 09:10:38.249424 12747 ts_tablet_manager.cc:616] Registered 1 tablets
I20251028 09:10:38.249472 12747 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20251028 09:10:38.249554 12770 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Bootstrap starting.
I20251028 09:10:38.257354 12747 rpc_server.cc:307] RPC server started. Bound to: 127.8.22.129:39271
I20251028 09:10:38.257519 12877 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.8.22.129:39271 every 8 connection(s)
I20251028 09:10:38.260155  8282 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6m1lU8/build/release/bin/kudu as pid 12747
I20251028 09:10:38.260268  8282 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6m1lU8/build/release/bin/kudu with pid 12072
I20251028 09:10:38.257769 12747 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data/info.pb
I20251028 09:10:38.277143  8282 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6m1lU8/build/release/bin/kudu
/tmp/dist-test-task6m1lU8/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.8.22.132:43959
--local_ip_for_outbound_sockets=127.8.22.132
--tserver_master_addrs=127.8.22.190:37123
--webserver_port=36911
--webserver_interface=127.8.22.132
--builtin_ntp_servers=127.8.22.148:35497
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20251028 09:10:38.297904 12878 heartbeater.cc:344] Connected to a master server at 127.8.22.190:37123
I20251028 09:10:38.298022 12878 heartbeater.cc:461] Registering TS with master...
I20251028 09:10:38.298278 12878 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:38.298938  9478 ts_manager.cc:194] Re-registered known tserver with Master: 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271)
I20251028 09:10:38.299515  9478 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.8.22.129:51683
I20251028 09:10:38.322381 10101 meta_cache.cc:1510] marking tablet server 3fedf3ecbec146b6a6ba544511a16fa1 (127.8.22.132:43959) as failed
I20251028 09:10:38.322378 12770 log.cc:826] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Log is configured to *not* fsync() on all Append() calls
I20251028 09:10:38.387902 10100 meta_cache.cc:1510] marking tablet server 3fedf3ecbec146b6a6ba544511a16fa1 (127.8.22.132:43959) as failed
I20251028 09:10:38.397441 10102 meta_cache.cc:1510] marking tablet server 3fedf3ecbec146b6a6ba544511a16fa1 (127.8.22.132:43959) as failed
W20251028 09:10:38.411248 12880 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251028 09:10:38.411432 12880 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251028 09:10:38.411468 12880 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251028 09:10:38.412984 12880 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251028 09:10:38.413053 12880 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.8.22.132
I20251028 09:10:38.414619 12880 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.8.22.148:35497
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.8.22.132:43959
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.8.22.132
--webserver_port=36911
--tserver_master_addrs=127.8.22.190:37123
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.12880
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.8.22.132
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 60f5e5267b92c39485a66121d3ce3cc7ef57b0e0
build type RELEASE
built by None at 28 Oct 2025 08:43:13 UTC on 5fd53c4cbb9d
build id 8755
I20251028 09:10:38.414887 12880 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251028 09:10:38.415170 12880 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251028 09:10:38.417795 12891 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:38.417819 12888 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:38.418135 12889 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:10:38.418231 12880 server_base.cc:1047] running on GCE node
I20251028 09:10:38.418476 12880 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251028 09:10:38.418685 12880 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251028 09:10:38.419862 12880 hybrid_clock.cc:648] HybridClock initialized: now 1761642638419841 us; error 56 us; skew 500 ppm
I20251028 09:10:38.421026 12880 webserver.cc:492] Webserver started at http://127.8.22.132:36911/ using document root <none> and password file <none>
I20251028 09:10:38.421232 12880 fs_manager.cc:362] Metadata directory not provided
I20251028 09:10:38.421280 12880 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251028 09:10:38.423324 12880 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251028 09:10:38.424185 12897 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:38.424322 12880 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.001s	sys 0.000s
I20251028 09:10:38.424383 12880 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data,/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/wal
uuid: "3fedf3ecbec146b6a6ba544511a16fa1"
format_stamp: "Formatted at 2025-10-28 09:10:15 on dist-test-slave-kqwd"
I20251028 09:10:38.424659 12880 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251028 09:10:38.432865 12880 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251028 09:10:38.433096 12880 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251028 09:10:38.433200 12880 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251028 09:10:38.433391 12880 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251028 09:10:38.436118 12904 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251028 09:10:38.437006 12880 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251028 09:10:38.437052 12880 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.003s	user 0.000s	sys 0.000s
I20251028 09:10:38.437088 12880 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251028 09:10:38.437606 12880 ts_tablet_manager.cc:616] Registered 1 tablets
I20251028 09:10:38.437652 12880 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20251028 09:10:38.437804 12904 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Bootstrap starting.
I20251028 09:10:38.444868 12880 rpc_server.cc:307] RPC server started. Bound to: 127.8.22.132:43959
I20251028 09:10:38.445287 12880 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data/info.pb
I20251028 09:10:38.448429 13011 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.8.22.132:43959 every 8 connection(s)
I20251028 09:10:38.452781  8282 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6m1lU8/build/release/bin/kudu as pid 12880
I20251028 09:10:38.452879  8282 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6m1lU8/build/release/bin/kudu with pid 12205
I20251028 09:10:38.461388  8282 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6m1lU8/build/release/bin/kudu
/tmp/dist-test-task6m1lU8/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.8.22.131:39559
--local_ip_for_outbound_sockets=127.8.22.131
--tserver_master_addrs=127.8.22.190:37123
--webserver_port=36027
--webserver_interface=127.8.22.131
--builtin_ntp_servers=127.8.22.148:35497
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20251028 09:10:38.465319 13012 heartbeater.cc:344] Connected to a master server at 127.8.22.190:37123
I20251028 09:10:38.465446 13012 heartbeater.cc:461] Registering TS with master...
I20251028 09:10:38.465680 13012 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:38.466236  9478 ts_manager.cc:194] Re-registered known tserver with Master: 3fedf3ecbec146b6a6ba544511a16fa1 (127.8.22.132:43959)
I20251028 09:10:38.466787  9478 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.8.22.132:58275
I20251028 09:10:38.500231 12904 log.cc:826] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Log is configured to *not* fsync() on all Append() calls
W20251028 09:10:38.588827 13016 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251028 09:10:38.589111 13016 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251028 09:10:38.589148 13016 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251028 09:10:38.591806 13016 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251028 09:10:38.591907 13016 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.8.22.131
I20251028 09:10:38.594715 13016 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.8.22.148:35497
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.8.22.131:39559
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.8.22.131
--webserver_port=36027
--tserver_master_addrs=127.8.22.190:37123
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.13016
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.8.22.131
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 60f5e5267b92c39485a66121d3ce3cc7ef57b0e0
build type RELEASE
built by None at 28 Oct 2025 08:43:13 UTC on 5fd53c4cbb9d
build id 8755
I20251028 09:10:38.595062 13016 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251028 09:10:38.595362 13016 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251028 09:10:38.599485 13023 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:38.600073 13025 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:38.600329 13022 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:10:38.603147 13016 server_base.cc:1047] running on GCE node
I20251028 09:10:38.603420 13016 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251028 09:10:38.603690 13016 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251028 09:10:38.604864 13016 hybrid_clock.cc:648] HybridClock initialized: now 1761642638604837 us; error 40 us; skew 500 ppm
I20251028 09:10:38.606555 13016 webserver.cc:492] Webserver started at http://127.8.22.131:36027/ using document root <none> and password file <none>
I20251028 09:10:38.606813 13016 fs_manager.cc:362] Metadata directory not provided
I20251028 09:10:38.606880 13016 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251028 09:10:38.608870 13016 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.002s	sys 0.000s
I20251028 09:10:38.609639 13031 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:38.609794 13016 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.001s	sys 0.000s
I20251028 09:10:38.609879 13016 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data,/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/wal
uuid: "bdeedf35d61a49bda10c70688541a0ea"
format_stamp: "Formatted at 2025-10-28 09:10:15 on dist-test-slave-kqwd"
I20251028 09:10:38.610224 13016 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251028 09:10:38.656301 13016 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251028 09:10:38.656611 13016 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251028 09:10:38.656742 13016 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251028 09:10:38.656975 13016 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251028 09:10:38.657351 13016 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20251028 09:10:38.657402 13016 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:38.657438 13016 ts_tablet_manager.cc:616] Registered 0 tablets
I20251028 09:10:38.657471 13016 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:38.665037 13016 rpc_server.cc:307] RPC server started. Bound to: 127.8.22.131:39559
I20251028 09:10:38.665479 13016 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data/info.pb
I20251028 09:10:38.665735  8282 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6m1lU8/build/release/bin/kudu as pid 13016
I20251028 09:10:38.669252 13144 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.8.22.131:39559 every 8 connection(s)
I20251028 09:10:38.689826 13146 heartbeater.cc:344] Connected to a master server at 127.8.22.190:37123
I20251028 09:10:38.689952 13146 heartbeater.cc:461] Registering TS with master...
I20251028 09:10:38.690204 13146 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:38.690568  9478 ts_manager.cc:194] Re-registered known tserver with Master: bdeedf35d61a49bda10c70688541a0ea (127.8.22.131:39559)
I20251028 09:10:38.691115  9478 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.8.22.131:44643
I20251028 09:10:38.863409 12945 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:38.876941 12794 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:38.878823 13079 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:38.888118 12677 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:39.134219 12743 heartbeater.cc:499] Master 127.8.22.190:37123 was elected leader, sending a full tablet report...
I20251028 09:10:39.300513 12878 heartbeater.cc:499] Master 127.8.22.190:37123 was elected leader, sending a full tablet report...
I20251028 09:10:39.449653 12635 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Bootstrap replayed 1/3 log segments. Stats: ops{read=4623 overwritten=0 applied=4620 ignored=0} inserts{seen=230850 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20251028 09:10:39.467774 13012 heartbeater.cc:499] Master 127.8.22.190:37123 was elected leader, sending a full tablet report...
I20251028 09:10:39.483805 12904 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Bootstrap replayed 1/3 log segments. Stats: ops{read=4623 overwritten=0 applied=4621 ignored=0} inserts{seen=230900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20251028 09:10:39.597013 12770 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Bootstrap replayed 1/3 log segments. Stats: ops{read=4623 overwritten=0 applied=4621 ignored=0} inserts{seen=230900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20251028 09:10:39.692028 13146 heartbeater.cc:499] Master 127.8.22.190:37123 was elected leader, sending a full tablet report...
I20251028 09:10:40.392232 12904 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Bootstrap replayed 2/3 log segments. Stats: ops{read=9374 overwritten=0 applied=9372 ignored=0} inserts{seen=468400 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20251028 09:10:40.713884 12904 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Bootstrap replayed 3/3 log segments. Stats: ops{read=11031 overwritten=0 applied=11031 ignored=0} inserts{seen=551350 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20251028 09:10:40.714349 12904 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Bootstrap complete.
I20251028 09:10:40.718907 12904 ts_tablet_manager.cc:1403] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Time spent bootstrapping tablet: real 2.281s	user 1.887s	sys 0.359s
I20251028 09:10:40.720023 12904 raft_consensus.cc:359] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 9 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:40.720224 12904 raft_consensus.cc:740] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 9 FOLLOWER]: Becoming Follower/Learner. State: Replica: 3fedf3ecbec146b6a6ba544511a16fa1, State: Initialized, Role: FOLLOWER
I20251028 09:10:40.720327 12904 consensus_queue.cc:260] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 11031, Last appended: 7.11031, Last appended by leader: 11031, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:40.720608 12904 ts_tablet_manager.cc:1434] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Time spent starting tablet: real 0.002s	user 0.001s	sys 0.000s
W20251028 09:10:40.746542 12920 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36322: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
I20251028 09:10:40.832055 12635 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Bootstrap replayed 2/3 log segments. Stats: ops{read=9245 overwritten=0 applied=9243 ignored=0} inserts{seen=461950 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20251028 09:10:40.925307 12770 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Bootstrap replayed 2/3 log segments. Stats: ops{read=9360 overwritten=0 applied=9359 ignored=0} inserts{seen=467750 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
W20251028 09:10:40.940006 12920 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36322: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:40.940024 12919 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36322: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:41.022342 12919 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36322: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
I20251028 09:10:41.030444 13185 raft_consensus.cc:493] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 9 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251028 09:10:41.030586 13185 raft_consensus.cc:515] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 9 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:41.030936 13185 leader_election.cc:290] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [CANDIDATE]: Term 10 pre-election: Requested pre-vote from peers 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271), a10940ed9d79478faf0dac2c7b960184 (127.8.22.130:41397)
I20251028 09:10:41.035688 12832 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" candidate_term: 10 candidate_status { last_received { term: 7 index: 11031 } } ignore_live_leader: false dest_uuid: "817e1b13d456460eb9915890dd578911" is_pre_election: true
I20251028 09:10:41.035962 12697 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" candidate_term: 10 candidate_status { last_received { term: 7 index: 11031 } } ignore_live_leader: false dest_uuid: "a10940ed9d79478faf0dac2c7b960184" is_pre_election: true
W20251028 09:10:41.036867 12898 leader_election.cc:343] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [CANDIDATE]: Term 10 pre-election: Tablet error from VoteRequest() call to peer 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271): Illegal state: must be running to vote when last-logged opid is not known
W20251028 09:10:41.036989 12899 leader_election.cc:343] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [CANDIDATE]: Term 10 pre-election: Tablet error from VoteRequest() call to peer a10940ed9d79478faf0dac2c7b960184 (127.8.22.130:41397): Illegal state: must be running to vote when last-logged opid is not known
I20251028 09:10:41.037029 12899 leader_election.cc:304] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [CANDIDATE]: Term 10 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 3fedf3ecbec146b6a6ba544511a16fa1; no voters: 817e1b13d456460eb9915890dd578911, a10940ed9d79478faf0dac2c7b960184
I20251028 09:10:41.037161 13185 raft_consensus.cc:2749] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 9 FOLLOWER]: Leader pre-election lost for term 10. Reason: could not achieve majority
I20251028 09:10:41.183704 12635 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Bootstrap replayed 3/3 log segments. Stats: ops{read=11034 overwritten=0 applied=11034 ignored=0} inserts{seen=551500 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20251028 09:10:41.184183 12635 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Bootstrap complete.
I20251028 09:10:41.188813 12635 ts_tablet_manager.cc:1403] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Time spent bootstrapping tablet: real 3.075s	user 2.589s	sys 0.440s
I20251028 09:10:41.189384 12635 raft_consensus.cc:359] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 9 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:41.189586 12635 raft_consensus.cc:740] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 9 FOLLOWER]: Becoming Follower/Learner. State: Replica: a10940ed9d79478faf0dac2c7b960184, State: Initialized, Role: FOLLOWER
I20251028 09:10:41.189673 12635 consensus_queue.cc:260] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 11034, Last appended: 7.11034, Last appended by leader: 11034, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:41.189914 12635 ts_tablet_manager.cc:1434] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Time spent starting tablet: real 0.001s	user 0.002s	sys 0.000s
W20251028 09:10:41.216944 12919 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36322: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:41.223133 12919 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36322: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
I20251028 09:10:41.240566 12770 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Bootstrap replayed 3/3 log segments. Stats: ops{read=11034 overwritten=0 applied=11032 ignored=0} inserts{seen=551400 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20251028 09:10:41.241065 12770 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Bootstrap complete.
I20251028 09:10:41.245533 12770 ts_tablet_manager.cc:1403] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Time spent bootstrapping tablet: real 2.996s	user 2.525s	sys 0.431s
I20251028 09:10:41.246366 12770 raft_consensus.cc:359] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 9 FOLLOWER]: Replica starting. Triggering 2 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:41.247035 12770 raft_consensus.cc:740] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 9 FOLLOWER]: Becoming Follower/Learner. State: Replica: 817e1b13d456460eb9915890dd578911, State: Initialized, Role: FOLLOWER
I20251028 09:10:41.247152 12770 consensus_queue.cc:260] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 11032, Last appended: 7.11034, Last appended by leader: 11034, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:41.247372 12770 ts_tablet_manager.cc:1434] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Time spent starting tablet: real 0.002s	user 0.000s	sys 0.004s
W20251028 09:10:41.302487 12919 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36322: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:41.312252 12656 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44346: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:41.318912 12656 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44346: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
I20251028 09:10:41.377977 13185 raft_consensus.cc:493] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 9 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251028 09:10:41.378098 13185 raft_consensus.cc:515] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 9 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:41.378322 13185 leader_election.cc:290] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [CANDIDATE]: Term 10 pre-election: Requested pre-vote from peers 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271), a10940ed9d79478faf0dac2c7b960184 (127.8.22.130:41397)
I20251028 09:10:41.378561 12697 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" candidate_term: 10 candidate_status { last_received { term: 7 index: 11031 } } ignore_live_leader: false dest_uuid: "a10940ed9d79478faf0dac2c7b960184" is_pre_election: true
I20251028 09:10:41.378582 12832 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" candidate_term: 10 candidate_status { last_received { term: 7 index: 11031 } } ignore_live_leader: false dest_uuid: "817e1b13d456460eb9915890dd578911" is_pre_election: true
I20251028 09:10:41.378677 12697 raft_consensus.cc:2410] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 9 FOLLOWER]: Leader pre-election vote request: Denying vote to candidate 3fedf3ecbec146b6a6ba544511a16fa1 for term 10 because replica has last-logged OpId of term: 7 index: 11034, which is greater than that of the candidate, which has last-logged OpId of term: 7 index: 11031.
I20251028 09:10:41.378716 12832 raft_consensus.cc:2410] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 9 FOLLOWER]: Leader pre-election vote request: Denying vote to candidate 3fedf3ecbec146b6a6ba544511a16fa1 for term 10 because replica has last-logged OpId of term: 7 index: 11034, which is greater than that of the candidate, which has last-logged OpId of term: 7 index: 11031.
I20251028 09:10:41.378990 12898 leader_election.cc:304] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [CANDIDATE]: Term 10 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 3fedf3ecbec146b6a6ba544511a16fa1; no voters: 817e1b13d456460eb9915890dd578911, a10940ed9d79478faf0dac2c7b960184
I20251028 09:10:41.379091 13185 raft_consensus.cc:2749] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 9 FOLLOWER]: Leader pre-election lost for term 10. Reason: could not achieve majority
W20251028 09:10:41.397341 12656 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44346: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:41.406279 12792 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38580: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:41.413251 12792 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38580: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
I20251028 09:10:41.491045 13194 raft_consensus.cc:493] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 9 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251028 09:10:41.491194 13194 raft_consensus.cc:515] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 9 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:41.491513 13194 leader_election.cc:290] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [CANDIDATE]: Term 10 pre-election: Requested pre-vote from peers a10940ed9d79478faf0dac2c7b960184 (127.8.22.130:41397), 3fedf3ecbec146b6a6ba544511a16fa1 (127.8.22.132:43959)
I20251028 09:10:41.495445 12697 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "817e1b13d456460eb9915890dd578911" candidate_term: 10 candidate_status { last_received { term: 7 index: 11034 } } ignore_live_leader: false dest_uuid: "a10940ed9d79478faf0dac2c7b960184" is_pre_election: true
I20251028 09:10:41.495615 12697 raft_consensus.cc:2468] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 9 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 817e1b13d456460eb9915890dd578911 in term 9.
I20251028 09:10:41.495568 12963 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "817e1b13d456460eb9915890dd578911" candidate_term: 10 candidate_status { last_received { term: 7 index: 11034 } } ignore_live_leader: false dest_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" is_pre_election: true
I20251028 09:10:41.495682 12963 raft_consensus.cc:2468] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 9 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 817e1b13d456460eb9915890dd578911 in term 9.
I20251028 09:10:41.495858 12765 leader_election.cc:304] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [CANDIDATE]: Term 10 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 817e1b13d456460eb9915890dd578911, a10940ed9d79478faf0dac2c7b960184; no voters: 
I20251028 09:10:41.495993 13194 raft_consensus.cc:2804] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 9 FOLLOWER]: Leader pre-election won for term 10
I20251028 09:10:41.496065 13194 raft_consensus.cc:493] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 9 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20251028 09:10:41.496085 13194 raft_consensus.cc:3060] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 9 FOLLOWER]: Advancing to term 10
W20251028 09:10:41.496459 12792 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38580: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
I20251028 09:10:41.497311 13194 raft_consensus.cc:515] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 10 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:41.497577 13194 leader_election.cc:290] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [CANDIDATE]: Term 10 election: Requested vote from peers a10940ed9d79478faf0dac2c7b960184 (127.8.22.130:41397), 3fedf3ecbec146b6a6ba544511a16fa1 (127.8.22.132:43959)
I20251028 09:10:41.497617 12697 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "817e1b13d456460eb9915890dd578911" candidate_term: 10 candidate_status { last_received { term: 7 index: 11034 } } ignore_live_leader: false dest_uuid: "a10940ed9d79478faf0dac2c7b960184"
I20251028 09:10:41.497608 12963 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "817e1b13d456460eb9915890dd578911" candidate_term: 10 candidate_status { last_received { term: 7 index: 11034 } } ignore_live_leader: false dest_uuid: "3fedf3ecbec146b6a6ba544511a16fa1"
I20251028 09:10:41.497687 12963 raft_consensus.cc:3060] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 9 FOLLOWER]: Advancing to term 10
I20251028 09:10:41.497684 12697 raft_consensus.cc:3060] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 9 FOLLOWER]: Advancing to term 10
I20251028 09:10:41.498845 12963 raft_consensus.cc:2468] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 10 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 817e1b13d456460eb9915890dd578911 in term 10.
I20251028 09:10:41.498845 12697 raft_consensus.cc:2468] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 10 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 817e1b13d456460eb9915890dd578911 in term 10.
I20251028 09:10:41.499097 12767 leader_election.cc:304] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [CANDIDATE]: Term 10 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 3fedf3ecbec146b6a6ba544511a16fa1, 817e1b13d456460eb9915890dd578911; no voters: 
I20251028 09:10:41.499225 13194 raft_consensus.cc:2804] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 10 FOLLOWER]: Leader election won for term 10
I20251028 09:10:41.499364 13194 raft_consensus.cc:697] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 10 LEADER]: Becoming Leader. State: Replica: 817e1b13d456460eb9915890dd578911, State: Running, Role: LEADER
I20251028 09:10:41.499558 13194 consensus_queue.cc:237] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 11032, Committed index: 11032, Last appended: 7.11034, Last appended by leader: 11034, Current term: 10, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:41.500221  9478 catalog_manager.cc:5649] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 reported cstate change: term changed from 7 to 10, leader changed from a10940ed9d79478faf0dac2c7b960184 (127.8.22.130) to 817e1b13d456460eb9915890dd578911 (127.8.22.129). New cstate: current_term: 10 leader_uuid: "817e1b13d456460eb9915890dd578911" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } health_report { overall_health: UNKNOWN } } }
W20251028 09:10:41.503232 12919 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36322: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:41.510344 12919 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36322: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
I20251028 09:10:41.592727 12697 raft_consensus.cc:1275] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 10 FOLLOWER]: Refusing update from remote peer 817e1b13d456460eb9915890dd578911: Log matching property violated. Preceding OpId in replica: term: 7 index: 11034. Preceding OpId from leader: term: 10 index: 11035. (index mismatch)
W20251028 09:10:41.592772 12919 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36322: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
I20251028 09:10:41.593147 13194 consensus_queue.cc:1048] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [LEADER]: Connected to new peer: Peer: permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 11035, Last known committed idx: 11034, Time since last communication: 0.000s
I20251028 09:10:41.595103 12963 raft_consensus.cc:1275] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 10 FOLLOWER]: Refusing update from remote peer 817e1b13d456460eb9915890dd578911: Log matching property violated. Preceding OpId in replica: term: 7 index: 11031. Preceding OpId from leader: term: 10 index: 11035. (index mismatch)
I20251028 09:10:41.595461 13194 consensus_queue.cc:1048] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [LEADER]: Connected to new peer: Peer: permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 11035, Last known committed idx: 11031, Time since last communication: 0.000s
W20251028 09:10:41.927258 10101 scanner-internal.cc:458] Time spent opening tablet: real 4.008s	user 0.001s	sys 0.001s
W20251028 09:10:41.993646 10100 scanner-internal.cc:458] Time spent opening tablet: real 4.010s	user 0.001s	sys 0.001s
W20251028 09:10:42.002462 10102 scanner-internal.cc:458] Time spent opening tablet: real 4.007s	user 0.001s	sys 0.000s
I20251028 09:10:44.154330 12945 tablet_service.cc:1467] Tablet server has 0 leaders and 3 scanners
I20251028 09:10:44.156069 12794 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
I20251028 09:10:44.157465 13079 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:44.178172 12677 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:44.607384  9478 ts_manager.cc:284] Unset tserver state for bdeedf35d61a49bda10c70688541a0ea from MAINTENANCE_MODE
I20251028 09:10:44.608016 13012 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:44.608112 12878 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:44.608995 12743 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:44.614374  9473 ts_manager.cc:284] Unset tserver state for 3fedf3ecbec146b6a6ba544511a16fa1 from MAINTENANCE_MODE
I20251028 09:10:44.626291  9473 ts_manager.cc:284] Unset tserver state for 817e1b13d456460eb9915890dd578911 from MAINTENANCE_MODE
I20251028 09:10:44.631076  9473 ts_manager.cc:284] Unset tserver state for a10940ed9d79478faf0dac2c7b960184 from MAINTENANCE_MODE
I20251028 09:10:44.695837 13146 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:44.975915  9473 ts_manager.cc:295] Set tserver state for 817e1b13d456460eb9915890dd578911 to MAINTENANCE_MODE
I20251028 09:10:45.050020  9473 ts_manager.cc:295] Set tserver state for bdeedf35d61a49bda10c70688541a0ea to MAINTENANCE_MODE
I20251028 09:10:45.151374  9473 ts_manager.cc:295] Set tserver state for 3fedf3ecbec146b6a6ba544511a16fa1 to MAINTENANCE_MODE
I20251028 09:10:45.180442  9473 ts_manager.cc:295] Set tserver state for a10940ed9d79478faf0dac2c7b960184 to MAINTENANCE_MODE
I20251028 09:10:45.397358 12794 tablet_service.cc:1460] Tablet server 817e1b13d456460eb9915890dd578911 set to quiescing
I20251028 09:10:45.397434 12794 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
I20251028 09:10:45.397703 13079 tablet_service.cc:1460] Tablet server bdeedf35d61a49bda10c70688541a0ea set to quiescing
I20251028 09:10:45.397781 13079 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:45.398236 13319 raft_consensus.cc:993] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: : Instructing follower a10940ed9d79478faf0dac2c7b960184 to start an election
I20251028 09:10:45.398301 13319 raft_consensus.cc:1081] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 10 LEADER]: Signalling peer a10940ed9d79478faf0dac2c7b960184 to start an election
I20251028 09:10:45.398732 13194 raft_consensus.cc:993] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: : Instructing follower 3fedf3ecbec146b6a6ba544511a16fa1 to start an election
I20251028 09:10:45.398788 13194 raft_consensus.cc:1081] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 10 LEADER]: Signalling peer 3fedf3ecbec146b6a6ba544511a16fa1 to start an election
I20251028 09:10:45.399333 13320 raft_consensus.cc:993] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: : Instructing follower a10940ed9d79478faf0dac2c7b960184 to start an election
I20251028 09:10:45.399317 12696 tablet_service.cc:2038] Received Run Leader Election RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" dest_uuid: "a10940ed9d79478faf0dac2c7b960184" from {username='slave'} at 127.8.22.129:55335
I20251028 09:10:45.399392 13320 raft_consensus.cc:1081] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 10 LEADER]: Signalling peer a10940ed9d79478faf0dac2c7b960184 to start an election
I20251028 09:10:45.399410 12696 raft_consensus.cc:493] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 10 FOLLOWER]: Starting forced leader election (received explicit request)
I20251028 09:10:45.399446 12696 raft_consensus.cc:3060] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 10 FOLLOWER]: Advancing to term 11
I20251028 09:10:45.399646 12697 tablet_service.cc:2038] Received Run Leader Election RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" dest_uuid: "a10940ed9d79478faf0dac2c7b960184" from {username='slave'} at 127.8.22.129:55335
I20251028 09:10:45.399842 13259 raft_consensus.cc:993] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: : Instructing follower 3fedf3ecbec146b6a6ba544511a16fa1 to start an election
I20251028 09:10:45.399890 13259 raft_consensus.cc:1081] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 10 LEADER]: Signalling peer 3fedf3ecbec146b6a6ba544511a16fa1 to start an election
I20251028 09:10:45.399864 12963 tablet_service.cc:2038] Received Run Leader Election RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" dest_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" from {username='slave'} at 127.8.22.129:47095
I20251028 09:10:45.399942 12963 raft_consensus.cc:493] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 10 FOLLOWER]: Starting forced leader election (received explicit request)
I20251028 09:10:45.399972 12963 raft_consensus.cc:3060] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 10 FOLLOWER]: Advancing to term 11
I20251028 09:10:45.400300 12962 tablet_service.cc:2038] Received Run Leader Election RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" dest_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" from {username='slave'} at 127.8.22.129:47095
I20251028 09:10:45.400316 12696 raft_consensus.cc:515] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 11 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:45.400892 12697 raft_consensus.cc:493] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 11 FOLLOWER]: Starting forced leader election (received explicit request)
I20251028 09:10:45.400941 12697 raft_consensus.cc:3060] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 11 FOLLOWER]: Advancing to term 12
I20251028 09:10:45.401494 12963 raft_consensus.cc:515] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 11 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:45.401607 12962 raft_consensus.cc:493] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 11 FOLLOWER]: Starting forced leader election (received explicit request)
I20251028 09:10:45.401649 12963 leader_election.cc:290] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [CANDIDATE]: Term 11 election: Requested vote from peers 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271), a10940ed9d79478faf0dac2c7b960184 (127.8.22.130:41397)
I20251028 09:10:45.401656 12697 raft_consensus.cc:515] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 12 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:45.401827 12694 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" candidate_term: 11 candidate_status { last_received { term: 10 index: 14073 } } ignore_live_leader: true dest_uuid: "a10940ed9d79478faf0dac2c7b960184"
I20251028 09:10:45.401911 12697 leader_election.cc:290] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [CANDIDATE]: Term 12 election: Requested vote from peers 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271), 3fedf3ecbec146b6a6ba544511a16fa1 (127.8.22.132:43959)
I20251028 09:10:45.402036 12832 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" candidate_term: 11 candidate_status { last_received { term: 10 index: 14073 } } ignore_live_leader: true dest_uuid: "817e1b13d456460eb9915890dd578911"
I20251028 09:10:45.401656 12962 raft_consensus.cc:3060] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 11 FOLLOWER]: Advancing to term 12
I20251028 09:10:45.402097 12832 raft_consensus.cc:3055] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 10 LEADER]: Stepping down as leader of term 10
I20251028 09:10:45.402119 12832 raft_consensus.cc:740] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 10 LEADER]: Becoming Follower/Learner. State: Replica: 817e1b13d456460eb9915890dd578911, State: Running, Role: LEADER
I20251028 09:10:45.402136 12695 raft_consensus.cc:1240] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 12 FOLLOWER]: Rejecting Update request from peer 817e1b13d456460eb9915890dd578911 for earlier term 10. Current term is 12. Ops: [10.14074-10.14074]
I20251028 09:10:45.402191 12832 consensus_queue.cc:260] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 14073, Committed index: 14073, Last appended: 10.14075, Last appended by leader: 14075, Current term: 10, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:45.402264 12832 raft_consensus.cc:3060] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 10 FOLLOWER]: Advancing to term 11
I20251028 09:10:45.402405 12696 leader_election.cc:290] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [CANDIDATE]: Term 11 election: Requested vote from peers 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271), 3fedf3ecbec146b6a6ba544511a16fa1 (127.8.22.132:43959)
I20251028 09:10:45.402781 12962 raft_consensus.cc:515] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 12 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:45.402997 12832 raft_consensus.cc:2410] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 11 FOLLOWER]: Leader election vote request: Denying vote to candidate 3fedf3ecbec146b6a6ba544511a16fa1 for term 11 because replica has last-logged OpId of term: 10 index: 14075, which is greater than that of the candidate, which has last-logged OpId of term: 10 index: 14073.
I20251028 09:10:45.403013 12962 leader_election.cc:290] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [CANDIDATE]: Term 12 election: Requested vote from peers 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271), a10940ed9d79478faf0dac2c7b960184 (127.8.22.130:41397)
I20251028 09:10:45.403100 12961 raft_consensus.cc:1240] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 12 FOLLOWER]: Rejecting Update request from peer 817e1b13d456460eb9915890dd578911 for earlier term 10. Current term is 12. Ops: [10.14074-10.14074]
I20251028 09:10:45.403296 12898 leader_election.cc:304] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [CANDIDATE]: Term 11 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 3fedf3ecbec146b6a6ba544511a16fa1; no voters: 817e1b13d456460eb9915890dd578911, a10940ed9d79478faf0dac2c7b960184
I20251028 09:10:45.403421 12832 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" candidate_term: 12 candidate_status { last_received { term: 10 index: 14073 } } ignore_live_leader: true dest_uuid: "817e1b13d456460eb9915890dd578911"
I20251028 09:10:45.403487 12832 raft_consensus.cc:3060] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 11 FOLLOWER]: Advancing to term 12
I20251028 09:10:45.403636 12696 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" candidate_term: 12 candidate_status { last_received { term: 10 index: 14073 } } ignore_live_leader: true dest_uuid: "a10940ed9d79478faf0dac2c7b960184"
I20251028 09:10:45.403703 12696 raft_consensus.cc:2393] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 12 FOLLOWER]: Leader election vote request: Denying vote to candidate 3fedf3ecbec146b6a6ba544511a16fa1 in current term 12: Already voted for candidate a10940ed9d79478faf0dac2c7b960184 in this term.
W20251028 09:10:45.403941 12792 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38580: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
I20251028 09:10:45.404196 12832 raft_consensus.cc:2410] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 12 FOLLOWER]: Leader election vote request: Denying vote to candidate 3fedf3ecbec146b6a6ba544511a16fa1 for term 12 because replica has last-logged OpId of term: 10 index: 14075, which is greater than that of the candidate, which has last-logged OpId of term: 10 index: 14073.
I20251028 09:10:45.404897 12898 leader_election.cc:304] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [CANDIDATE]: Term 12 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 3fedf3ecbec146b6a6ba544511a16fa1; no voters: 817e1b13d456460eb9915890dd578911, a10940ed9d79478faf0dac2c7b960184
I20251028 09:10:45.405078 13388 raft_consensus.cc:2749] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 12 FOLLOWER]: Leader election lost for term 11. Reason: could not achieve majority
I20251028 09:10:45.405192 13388 raft_consensus.cc:2749] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 12 FOLLOWER]: Leader election lost for term 12. Reason: could not achieve majority
I20251028 09:10:45.407702 12832 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "a10940ed9d79478faf0dac2c7b960184" candidate_term: 11 candidate_status { last_received { term: 10 index: 14073 } } ignore_live_leader: true dest_uuid: "817e1b13d456460eb9915890dd578911"
I20251028 09:10:45.407794 12832 raft_consensus.cc:2368] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 12 FOLLOWER]: Leader election vote request: Denying vote to candidate a10940ed9d79478faf0dac2c7b960184 for earlier term 11. Current term is 12.
I20251028 09:10:45.407869 12831 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "a10940ed9d79478faf0dac2c7b960184" candidate_term: 12 candidate_status { last_received { term: 10 index: 14073 } } ignore_live_leader: true dest_uuid: "817e1b13d456460eb9915890dd578911"
I20251028 09:10:45.407915 12831 raft_consensus.cc:2410] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 12 FOLLOWER]: Leader election vote request: Denying vote to candidate a10940ed9d79478faf0dac2c7b960184 for term 12 because replica has last-logged OpId of term: 10 index: 14075, which is greater than that of the candidate, which has last-logged OpId of term: 10 index: 14073.
I20251028 09:10:45.408097 12629 leader_election.cc:400] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [CANDIDATE]: Term 11 election: Vote denied by peer 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271) with higher term. Message: Invalid argument: T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 12 FOLLOWER]: Leader election vote request: Denying vote to candidate a10940ed9d79478faf0dac2c7b960184 for earlier term 11. Current term is 12.
I20251028 09:10:45.408138 12629 leader_election.cc:403] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [CANDIDATE]: Term 11 election: Cancelling election due to peer responding with higher term
I20251028 09:10:45.408483 13390 raft_consensus.cc:2749] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 12 FOLLOWER]: Leader election lost for term 11. Reason: Vote denied by peer 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271) with higher term. Message: Invalid argument: T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 12 FOLLOWER]: Leader election vote request: Denying vote to candidate a10940ed9d79478faf0dac2c7b960184 for earlier term 11. Current term is 12.
W20251028 09:10:45.408572 12656 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44346: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
I20251028 09:10:45.409943 12961 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "a10940ed9d79478faf0dac2c7b960184" candidate_term: 11 candidate_status { last_received { term: 10 index: 14073 } } ignore_live_leader: true dest_uuid: "3fedf3ecbec146b6a6ba544511a16fa1"
I20251028 09:10:45.410040 12961 raft_consensus.cc:2368] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 12 FOLLOWER]: Leader election vote request: Denying vote to candidate a10940ed9d79478faf0dac2c7b960184 for earlier term 11. Current term is 12.
I20251028 09:10:45.410176 12962 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "a10940ed9d79478faf0dac2c7b960184" candidate_term: 12 candidate_status { last_received { term: 10 index: 14073 } } ignore_live_leader: true dest_uuid: "3fedf3ecbec146b6a6ba544511a16fa1"
I20251028 09:10:45.410228 12962 raft_consensus.cc:2393] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 12 FOLLOWER]: Leader election vote request: Denying vote to candidate a10940ed9d79478faf0dac2c7b960184 in current term 12: Already voted for candidate 3fedf3ecbec146b6a6ba544511a16fa1 in this term.
I20251028 09:10:45.410229 12632 leader_election.cc:400] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [CANDIDATE]: Term 11 election: Vote denied by peer 3fedf3ecbec146b6a6ba544511a16fa1 (127.8.22.132:43959) with higher term. Message: Invalid argument: T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 12 FOLLOWER]: Leader election vote request: Denying vote to candidate a10940ed9d79478faf0dac2c7b960184 for earlier term 11. Current term is 12.
I20251028 09:10:45.410413 12632 leader_election.cc:304] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [CANDIDATE]: Term 12 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: a10940ed9d79478faf0dac2c7b960184; no voters: 3fedf3ecbec146b6a6ba544511a16fa1, 817e1b13d456460eb9915890dd578911
I20251028 09:10:45.410569 13390 raft_consensus.cc:2749] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 12 FOLLOWER]: Leader election lost for term 12. Reason: could not achieve majority
W20251028 09:10:45.415289 12920 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36322: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:45.419572 12792 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38580: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:45.426178 12656 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44346: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:45.435137 12918 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36322: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:45.442540 12792 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38580: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:45.450155 12656 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44346: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:45.458851 12918 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36322: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:45.470901 12792 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38580: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:45.484524 12656 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44346: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
I20251028 09:10:45.493734 12945 tablet_service.cc:1460] Tablet server 3fedf3ecbec146b6a6ba544511a16fa1 set to quiescing
I20251028 09:10:45.493811 12945 tablet_service.cc:1467] Tablet server has 0 leaders and 3 scanners
W20251028 09:10:45.498095 12918 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36322: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
I20251028 09:10:45.501744 12677 tablet_service.cc:1460] Tablet server a10940ed9d79478faf0dac2c7b960184 set to quiescing
I20251028 09:10:45.501823 12677 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20251028 09:10:45.514218 12792 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38580: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:45.531915 12656 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44346: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:45.550786 12926 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36322: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:45.568017 12792 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38580: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:45.588851 12656 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44346: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:45.607638 12918 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36322: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
I20251028 09:10:45.609459 12878 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:45.609998 13012 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:45.610462 12743 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
W20251028 09:10:45.629127 12792 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38580: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:45.649844 12656 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44346: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:45.670514 12926 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36322: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:45.692703 12792 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38580: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:45.706799 13389 raft_consensus.cc:670] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: failed to trigger leader election: Illegal state: leader elections are disabled
W20251028 09:10:45.716545 12656 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44346: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:45.725065 13320 raft_consensus.cc:670] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: failed to trigger leader election: Illegal state: leader elections are disabled
W20251028 09:10:45.740404 12926 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36322: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:45.765563 12792 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38580: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:45.793227 12656 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44346: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:45.823978 12920 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36322: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:45.852209 12792 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38580: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:45.882827 12656 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44346: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:45.898237 13390 raft_consensus.cc:670] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: failed to trigger leader election: Illegal state: leader elections are disabled
W20251028 09:10:45.914521 12926 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36322: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:45.947292 12792 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38580: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:45.981259 12656 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44346: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:46.014362 12918 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36322: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:46.050834 12792 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38580: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:46.087881 12656 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44346: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:46.126269 12918 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36322: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:46.165935 12792 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38580: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:46.207906 12656 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44346: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:46.246745 12918 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36322: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:46.289247 12792 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38580: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:46.331326 12656 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44346: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:46.373250 12918 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36322: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:46.417953 12792 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38580: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:46.463193 12656 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44346: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:46.512501 12926 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36322: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
I20251028 09:10:46.556844 12794 tablet_service.cc:1460] Tablet server 817e1b13d456460eb9915890dd578911 set to quiescing
I20251028 09:10:46.556918 12794 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20251028 09:10:46.560158 12791 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38580: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:46.611222 12657 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:44346: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
I20251028 09:10:46.645398 12945 tablet_service.cc:1460] Tablet server 3fedf3ecbec146b6a6ba544511a16fa1 set to quiescing
I20251028 09:10:46.645479 12945 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20251028 09:10:46.659317 12926 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36322: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
I20251028 09:10:46.701719  8282 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6m1lU8/build/release/bin/kudu with pid 12613
W20251028 09:10:46.710820 12791 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38580: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
I20251028 09:10:46.712679  8282 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6m1lU8/build/release/bin/kudu
/tmp/dist-test-task6m1lU8/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.8.22.130:41397
--local_ip_for_outbound_sockets=127.8.22.130
--tserver_master_addrs=127.8.22.190:37123
--webserver_port=36451
--webserver_interface=127.8.22.130
--builtin_ntp_servers=127.8.22.148:35497
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20251028 09:10:46.761895 10072 meta_cache.cc:302] tablet cd9b9f67e7d142db81a1c8be59070ef2: replica a10940ed9d79478faf0dac2c7b960184 (127.8.22.130:41397) has failed: Network error: Client connection negotiation failed: client connection to 127.8.22.130:41397: connect: Connection refused (error 111)
W20251028 09:10:46.794443 13432 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251028 09:10:46.794623 13432 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251028 09:10:46.794646 13432 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251028 09:10:46.796166 13432 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251028 09:10:46.796231 13432 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.8.22.130
I20251028 09:10:46.797801 13432 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.8.22.148:35497
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.8.22.130:41397
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.8.22.130
--webserver_port=36451
--tserver_master_addrs=127.8.22.190:37123
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.13432
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.8.22.130
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 60f5e5267b92c39485a66121d3ce3cc7ef57b0e0
build type RELEASE
built by None at 28 Oct 2025 08:43:13 UTC on 5fd53c4cbb9d
build id 8755
I20251028 09:10:46.798038 13432 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251028 09:10:46.798285 13432 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251028 09:10:46.801366 13439 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:46.801374 13441 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:46.801420 13438 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:10:46.801535 13432 server_base.cc:1047] running on GCE node
I20251028 09:10:46.801703 13432 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251028 09:10:46.801908 13432 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251028 09:10:46.803018 13432 hybrid_clock.cc:648] HybridClock initialized: now 1761642646803000 us; error 39 us; skew 500 ppm
I20251028 09:10:46.804188 13432 webserver.cc:492] Webserver started at http://127.8.22.130:36451/ using document root <none> and password file <none>
I20251028 09:10:46.804396 13432 fs_manager.cc:362] Metadata directory not provided
I20251028 09:10:46.804445 13432 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251028 09:10:46.805729 13432 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251028 09:10:46.806484 13447 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:46.806691 13432 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20251028 09:10:46.806758 13432 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data,/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/wal
uuid: "a10940ed9d79478faf0dac2c7b960184"
format_stamp: "Formatted at 2025-10-28 09:10:15 on dist-test-slave-kqwd"
I20251028 09:10:46.807082 13432 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
W20251028 09:10:46.814723 12926 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36322: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
I20251028 09:10:46.820605 13432 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251028 09:10:46.820865 13432 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251028 09:10:46.820986 13432 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251028 09:10:46.821185 13432 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251028 09:10:46.821646 13454 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251028 09:10:46.822520 13432 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251028 09:10:46.822574 13432 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251028 09:10:46.822625 13432 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251028 09:10:46.823240 13432 ts_tablet_manager.cc:616] Registered 1 tablets
I20251028 09:10:46.823277 13432 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20251028 09:10:46.823339 13454 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Bootstrap starting.
I20251028 09:10:46.829804 13432 rpc_server.cc:307] RPC server started. Bound to: 127.8.22.130:41397
I20251028 09:10:46.829891 13561 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.8.22.130:41397 every 8 connection(s)
I20251028 09:10:46.830196 13432 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data/info.pb
I20251028 09:10:46.834847 13454 log.cc:826] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Log is configured to *not* fsync() on all Append() calls
I20251028 09:10:46.838717  8282 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6m1lU8/build/release/bin/kudu as pid 13432
I20251028 09:10:46.838832  8282 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6m1lU8/build/release/bin/kudu with pid 12747
I20251028 09:10:46.838871 13562 heartbeater.cc:344] Connected to a master server at 127.8.22.190:37123
I20251028 09:10:46.838977 13562 heartbeater.cc:461] Registering TS with master...
I20251028 09:10:46.839197 13562 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:46.839991  9475 ts_manager.cc:194] Re-registered known tserver with Master: a10940ed9d79478faf0dac2c7b960184 (127.8.22.130:41397)
I20251028 09:10:46.840390  9475 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.8.22.130:43811
I20251028 09:10:46.849718  8282 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6m1lU8/build/release/bin/kudu
/tmp/dist-test-task6m1lU8/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.8.22.129:39271
--local_ip_for_outbound_sockets=127.8.22.129
--tserver_master_addrs=127.8.22.190:37123
--webserver_port=38771
--webserver_interface=127.8.22.129
--builtin_ntp_servers=127.8.22.148:35497
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20251028 09:10:46.852291 12926 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36322: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:46.854291 12926 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36322: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:46.878592 12926 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36322: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:46.884908 12926 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36322: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:46.903476 12926 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36322: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:46.930474 12926 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36322: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:46.939079 13567 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251028 09:10:46.939260 13567 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251028 09:10:46.939293 13567 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251028 09:10:46.939682 12926 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36322: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:46.941126 13567 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251028 09:10:46.941195 13567 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.8.22.129
I20251028 09:10:46.942898 13567 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.8.22.148:35497
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.8.22.129:39271
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.8.22.129
--webserver_port=38771
--tserver_master_addrs=127.8.22.190:37123
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.13567
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.8.22.129
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 60f5e5267b92c39485a66121d3ce3cc7ef57b0e0
build type RELEASE
built by None at 28 Oct 2025 08:43:13 UTC on 5fd53c4cbb9d
build id 8755
I20251028 09:10:46.943176 13567 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251028 09:10:46.943460 13567 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251028 09:10:46.946727 13574 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:46.946926 13576 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:46.947157 13573 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:10:46.947439 13567 server_base.cc:1047] running on GCE node
I20251028 09:10:46.947640 13567 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251028 09:10:46.947896 13567 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251028 09:10:46.949038 13567 hybrid_clock.cc:648] HybridClock initialized: now 1761642646949007 us; error 62 us; skew 500 ppm
I20251028 09:10:46.950572 13567 webserver.cc:492] Webserver started at http://127.8.22.129:38771/ using document root <none> and password file <none>
I20251028 09:10:46.950829 13567 fs_manager.cc:362] Metadata directory not provided
I20251028 09:10:46.950878 13567 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251028 09:10:46.952678 13567 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251028 09:10:46.953644 13582 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:46.953827 13567 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20251028 09:10:46.953905 13567 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data,/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/wal
uuid: "817e1b13d456460eb9915890dd578911"
format_stamp: "Formatted at 2025-10-28 09:10:15 on dist-test-slave-kqwd"
I20251028 09:10:46.954238 13567 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251028 09:10:46.965263 13567 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251028 09:10:46.965552 13567 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251028 09:10:46.965694 13567 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251028 09:10:46.965947 13567 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251028 09:10:46.966430 13589 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251028 09:10:46.967372 13567 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251028 09:10:46.967433 13567 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251028 09:10:46.967468 13567 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251028 09:10:46.968199 13567 ts_tablet_manager.cc:616] Registered 1 tablets
I20251028 09:10:46.968243 13567 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20251028 09:10:46.968322 13589 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Bootstrap starting.
W20251028 09:10:46.973739 12926 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36322: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
I20251028 09:10:46.975888 13567 rpc_server.cc:307] RPC server started. Bound to: 127.8.22.129:39271
I20251028 09:10:46.976302 13567 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data/info.pb
I20251028 09:10:46.982057 13697 heartbeater.cc:344] Connected to a master server at 127.8.22.190:37123
I20251028 09:10:46.982267 13697 heartbeater.cc:461] Registering TS with master...
I20251028 09:10:46.982589 13697 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:46.983269  9475 ts_manager.cc:194] Re-registered known tserver with Master: 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271)
I20251028 09:10:46.983351 13696 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.8.22.129:39271 every 8 connection(s)
I20251028 09:10:46.983826  9475 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.8.22.129:54969
W20251028 09:10:46.984530 12926 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36322: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
I20251028 09:10:46.985098 13589 log.cc:826] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Log is configured to *not* fsync() on all Append() calls
I20251028 09:10:46.985301  8282 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6m1lU8/build/release/bin/kudu as pid 13567
I20251028 09:10:46.985379  8282 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6m1lU8/build/release/bin/kudu with pid 12880
W20251028 09:10:47.001082 10074 connection.cc:537] client connection to 127.8.22.132:43959 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
I20251028 09:10:47.001439  8282 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6m1lU8/build/release/bin/kudu
/tmp/dist-test-task6m1lU8/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.8.22.132:43959
--local_ip_for_outbound_sockets=127.8.22.132
--tserver_master_addrs=127.8.22.190:37123
--webserver_port=36911
--webserver_interface=127.8.22.132
--builtin_ntp_servers=127.8.22.148:35497
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20251028 09:10:47.127125 13702 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251028 09:10:47.127384 13702 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251028 09:10:47.127422 13702 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251028 09:10:47.129999 13702 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251028 09:10:47.130102 13702 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.8.22.132
I20251028 09:10:47.132826 13702 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.8.22.148:35497
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.8.22.132:43959
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.8.22.132
--webserver_port=36911
--tserver_master_addrs=127.8.22.190:37123
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.13702
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.8.22.132
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 60f5e5267b92c39485a66121d3ce3cc7ef57b0e0
build type RELEASE
built by None at 28 Oct 2025 08:43:13 UTC on 5fd53c4cbb9d
build id 8755
I20251028 09:10:47.133157 13702 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251028 09:10:47.133448 13702 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251028 09:10:47.136705 13707 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:47.136732 13710 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:47.137065 13708 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:10:47.138298 13702 server_base.cc:1047] running on GCE node
I20251028 09:10:47.138644 13702 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251028 09:10:47.139078 13702 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251028 09:10:47.140290 13702 hybrid_clock.cc:648] HybridClock initialized: now 1761642647140199 us; error 108 us; skew 500 ppm
I20251028 09:10:47.142210 13702 webserver.cc:492] Webserver started at http://127.8.22.132:36911/ using document root <none> and password file <none>
I20251028 09:10:47.142503 13702 fs_manager.cc:362] Metadata directory not provided
I20251028 09:10:47.142568 13702 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251028 09:10:47.145026 13702 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251028 09:10:47.146184 13716 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:47.146386 13702 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20251028 09:10:47.146474 13702 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data,/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/wal
uuid: "3fedf3ecbec146b6a6ba544511a16fa1"
format_stamp: "Formatted at 2025-10-28 09:10:15 on dist-test-slave-kqwd"
I20251028 09:10:47.146816 13702 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251028 09:10:47.173307 13702 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251028 09:10:47.173614 13702 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251028 09:10:47.173744 13702 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251028 09:10:47.174013 13702 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251028 09:10:47.174562 13723 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251028 09:10:47.175760 13702 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251028 09:10:47.175824 13702 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251028 09:10:47.175860 13702 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251028 09:10:47.176596 13702 ts_tablet_manager.cc:616] Registered 1 tablets
I20251028 09:10:47.176641 13702 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20251028 09:10:47.176771 13723 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Bootstrap starting.
I20251028 09:10:47.185406 13702 rpc_server.cc:307] RPC server started. Bound to: 127.8.22.132:43959
I20251028 09:10:47.185838 13702 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data/info.pb
I20251028 09:10:47.186172 13830 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.8.22.132:43959 every 8 connection(s)
I20251028 09:10:47.188208  8282 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6m1lU8/build/release/bin/kudu as pid 13702
I20251028 09:10:47.188303  8282 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6m1lU8/build/release/bin/kudu with pid 13016
I20251028 09:10:47.191915 13831 heartbeater.cc:344] Connected to a master server at 127.8.22.190:37123
I20251028 09:10:47.192014 13831 heartbeater.cc:461] Registering TS with master...
I20251028 09:10:47.192219 13831 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:47.192732  9475 ts_manager.cc:194] Re-registered known tserver with Master: 3fedf3ecbec146b6a6ba544511a16fa1 (127.8.22.132:43959)
I20251028 09:10:47.193284  9475 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.8.22.132:44253
I20251028 09:10:47.197002  8282 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6m1lU8/build/release/bin/kudu
/tmp/dist-test-task6m1lU8/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.8.22.131:39559
--local_ip_for_outbound_sockets=127.8.22.131
--tserver_master_addrs=127.8.22.190:37123
--webserver_port=36027
--webserver_interface=127.8.22.131
--builtin_ntp_servers=127.8.22.148:35497
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20251028 09:10:47.198217 13723 log.cc:826] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Log is configured to *not* fsync() on all Append() calls
W20251028 09:10:47.329380 13835 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251028 09:10:47.329612 13835 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251028 09:10:47.329648 13835 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251028 09:10:47.332203 13835 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251028 09:10:47.332352 13835 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.8.22.131
I20251028 09:10:47.334913 13835 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.8.22.148:35497
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.8.22.131:39559
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.8.22.131
--webserver_port=36027
--tserver_master_addrs=127.8.22.190:37123
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.13835
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.8.22.131
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 60f5e5267b92c39485a66121d3ce3cc7ef57b0e0
build type RELEASE
built by None at 28 Oct 2025 08:43:13 UTC on 5fd53c4cbb9d
build id 8755
I20251028 09:10:47.335747 13835 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251028 09:10:47.336081 13835 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251028 09:10:47.339751 13844 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:47.339946 13841 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:10:47.340152 13835 server_base.cc:1047] running on GCE node
W20251028 09:10:47.340498 13842 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:10:47.340737 13835 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251028 09:10:47.340950 13835 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251028 09:10:47.342104 13835 hybrid_clock.cc:648] HybridClock initialized: now 1761642647342072 us; error 44 us; skew 500 ppm
I20251028 09:10:47.343513 13835 webserver.cc:492] Webserver started at http://127.8.22.131:36027/ using document root <none> and password file <none>
I20251028 09:10:47.343750 13835 fs_manager.cc:362] Metadata directory not provided
I20251028 09:10:47.343797 13835 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251028 09:10:47.345382 13835 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251028 09:10:47.346093 13850 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:47.346280 13835 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.000s	sys 0.000s
I20251028 09:10:47.346342 13835 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data,/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/wal
uuid: "bdeedf35d61a49bda10c70688541a0ea"
format_stamp: "Formatted at 2025-10-28 09:10:15 on dist-test-slave-kqwd"
I20251028 09:10:47.346664 13835 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251028 09:10:47.362783 13835 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251028 09:10:47.363149 13835 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251028 09:10:47.363317 13835 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251028 09:10:47.363564 13835 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251028 09:10:47.363984 13835 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20251028 09:10:47.364069 13835 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:47.364145 13835 ts_tablet_manager.cc:616] Registered 0 tablets
I20251028 09:10:47.364200 13835 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:47.370805 13835 rpc_server.cc:307] RPC server started. Bound to: 127.8.22.131:39559
I20251028 09:10:47.371563 13835 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data/info.pb
I20251028 09:10:47.374186 13963 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.8.22.131:39559 every 8 connection(s)
I20251028 09:10:47.380622 13964 heartbeater.cc:344] Connected to a master server at 127.8.22.190:37123
I20251028 09:10:47.380779 13964 heartbeater.cc:461] Registering TS with master...
I20251028 09:10:47.381047 13964 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:47.381546  8282 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6m1lU8/build/release/bin/kudu as pid 13835
I20251028 09:10:47.383523  9475 ts_manager.cc:194] Re-registered known tserver with Master: bdeedf35d61a49bda10c70688541a0ea (127.8.22.131:39559)
I20251028 09:10:47.384012  9475 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.8.22.131:47147
I20251028 09:10:47.559420 13765 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:47.576237 13628 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:47.584223 13496 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:47.596748 13898 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:47.841288 13562 heartbeater.cc:499] Master 127.8.22.190:37123 was elected leader, sending a full tablet report...
I20251028 09:10:47.866919 13454 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Bootstrap replayed 1/4 log segments. Stats: ops{read=4623 overwritten=0 applied=4620 ignored=0} inserts{seen=230850 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20251028 09:10:47.985371 13697 heartbeater.cc:499] Master 127.8.22.190:37123 was elected leader, sending a full tablet report...
I20251028 09:10:48.194267 13831 heartbeater.cc:499] Master 127.8.22.190:37123 was elected leader, sending a full tablet report...
I20251028 09:10:48.257771 13589 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Bootstrap replayed 1/4 log segments. Stats: ops{read=4623 overwritten=0 applied=4621 ignored=0} inserts{seen=230900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20251028 09:10:48.385185 13964 heartbeater.cc:499] Master 127.8.22.190:37123 was elected leader, sending a full tablet report...
I20251028 09:10:48.547123 13723 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Bootstrap replayed 1/4 log segments. Stats: ops{read=4623 overwritten=0 applied=4621 ignored=0} inserts{seen=230900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20251028 09:10:48.848347 13454 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Bootstrap replayed 2/4 log segments. Stats: ops{read=9245 overwritten=0 applied=9243 ignored=0} inserts{seen=461950 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20251028 09:10:49.473235 13589 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Bootstrap replayed 2/4 log segments. Stats: ops{read=9245 overwritten=0 applied=9243 ignored=0} inserts{seen=461950 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20251028 09:10:49.865784 13454 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Bootstrap replayed 3/4 log segments. Stats: ops{read=14033 overwritten=0 applied=14033 ignored=0} inserts{seen=701400 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20251028 09:10:49.874029 13454 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Bootstrap replayed 4/4 log segments. Stats: ops{read=14073 overwritten=0 applied=14073 ignored=0} inserts{seen=703400 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20251028 09:10:49.874524 13454 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Bootstrap complete.
I20251028 09:10:49.880664 13454 ts_tablet_manager.cc:1403] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Time spent bootstrapping tablet: real 3.057s	user 2.592s	sys 0.447s
I20251028 09:10:49.881633 13454 raft_consensus.cc:359] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 12 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:49.881839 13454 raft_consensus.cc:740] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 12 FOLLOWER]: Becoming Follower/Learner. State: Replica: a10940ed9d79478faf0dac2c7b960184, State: Initialized, Role: FOLLOWER
I20251028 09:10:49.881942 13454 consensus_queue.cc:260] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 14073, Last appended: 10.14073, Last appended by leader: 14073, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:49.882201 13454 ts_tablet_manager.cc:1434] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Time spent starting tablet: real 0.001s	user 0.003s	sys 0.000s
I20251028 09:10:49.955590 13723 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Bootstrap replayed 2/4 log segments. Stats: ops{read=9246 overwritten=0 applied=9244 ignored=0} inserts{seen=462000 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
W20251028 09:10:49.971905 13476 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40886: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:50.029758 13476 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40886: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:50.064313 10101 scanner-internal.cc:458] Time spent opening tablet: real 4.008s	user 0.001s	sys 0.001s
W20251028 09:10:50.094799 10102 scanner-internal.cc:458] Time spent opening tablet: real 4.008s	user 0.002s	sys 0.000s
W20251028 09:10:50.101863 10100 scanner-internal.cc:458] Time spent opening tablet: real 4.015s	user 0.002s	sys 0.000s
I20251028 09:10:50.142704 14006 raft_consensus.cc:493] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 12 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251028 09:10:50.142840 14006 raft_consensus.cc:515] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 12 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:50.143435 14006 leader_election.cc:290] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [CANDIDATE]: Term 13 pre-election: Requested pre-vote from peers 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271), 3fedf3ecbec146b6a6ba544511a16fa1 (127.8.22.132:43959)
I20251028 09:10:50.163141 13635 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "a10940ed9d79478faf0dac2c7b960184" candidate_term: 13 candidate_status { last_received { term: 10 index: 14073 } } ignore_live_leader: false dest_uuid: "817e1b13d456460eb9915890dd578911" is_pre_election: true
I20251028 09:10:50.163290 13784 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "a10940ed9d79478faf0dac2c7b960184" candidate_term: 13 candidate_status { last_received { term: 10 index: 14073 } } ignore_live_leader: false dest_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" is_pre_election: true
W20251028 09:10:50.164279 13451 leader_election.cc:343] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [CANDIDATE]: Term 13 pre-election: Tablet error from VoteRequest() call to peer 3fedf3ecbec146b6a6ba544511a16fa1 (127.8.22.132:43959): Illegal state: must be running to vote when last-logged opid is not known
W20251028 09:10:50.164386 13448 leader_election.cc:343] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [CANDIDATE]: Term 13 pre-election: Tablet error from VoteRequest() call to peer 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271): Illegal state: must be running to vote when last-logged opid is not known
I20251028 09:10:50.164445 13448 leader_election.cc:304] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [CANDIDATE]: Term 13 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: a10940ed9d79478faf0dac2c7b960184; no voters: 3fedf3ecbec146b6a6ba544511a16fa1, 817e1b13d456460eb9915890dd578911
I20251028 09:10:50.164590 14006 raft_consensus.cc:2749] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 12 FOLLOWER]: Leader pre-election lost for term 13. Reason: could not achieve majority
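
A minimal sketch of the vote-counting rule implied by the election summary above (3 voters, 1 yes, 2 no, "could not achieve majority"). The struct and names are invented for illustration and are not Kudu's classes; a candidate simply needs a strict majority of all voters, counting its own yes vote.

    // Illustrative only: deciding a 3-voter pre-election once all responses arrive.
    #include <iostream>

    struct ElectionResult {
      int yes_votes;
      int no_votes;
      int num_voters;
      bool won() const {
        // Strict majority of all voters; with 3 voters that means at least 2 yes votes.
        return yes_votes >= num_voters / 2 + 1;
      }
    };

    int main() {
      // Mirrors the summary above: 3 voters, 1 yes (the candidate itself), 2 no.
      ElectionResult r{1, 2, 3};
      std::cout << (r.won() ? "candidate won" : "candidate lost") << "\n";  // candidate lost
      return 0;
    }
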
W20251028 09:10:50.165567 13474 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40886: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:50.216868 13476 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40886: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:50.271632 13476 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40886: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:50.460392 13475 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40886: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:50.466809 13475 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40886: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:50.526361 13475 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40886: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
I20251028 09:10:50.578202 14006 raft_consensus.cc:493] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 12 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251028 09:10:50.578311 14006 raft_consensus.cc:515] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 12 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:50.578469 14006 leader_election.cc:290] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [CANDIDATE]: Term 13 pre-election: Requested pre-vote from peers 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271), 3fedf3ecbec146b6a6ba544511a16fa1 (127.8.22.132:43959)
I20251028 09:10:50.578740 13635 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "a10940ed9d79478faf0dac2c7b960184" candidate_term: 13 candidate_status { last_received { term: 10 index: 14073 } } ignore_live_leader: false dest_uuid: "817e1b13d456460eb9915890dd578911" is_pre_election: true
W20251028 09:10:50.579147 13448 leader_election.cc:343] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [CANDIDATE]: Term 13 pre-election: Tablet error from VoteRequest() call to peer 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271): Illegal state: must be running to vote when last-logged opid is not known
I20251028 09:10:50.578939 13784 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "a10940ed9d79478faf0dac2c7b960184" candidate_term: 13 candidate_status { last_received { term: 10 index: 14073 } } ignore_live_leader: false dest_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" is_pre_election: true
W20251028 09:10:50.579515 13451 leader_election.cc:343] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [CANDIDATE]: Term 13 pre-election: Tablet error from VoteRequest() call to peer 3fedf3ecbec146b6a6ba544511a16fa1 (127.8.22.132:43959): Illegal state: must be running to vote when last-logged opid is not known
I20251028 09:10:50.579567 13451 leader_election.cc:304] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [CANDIDATE]: Term 13 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: a10940ed9d79478faf0dac2c7b960184; no voters: 3fedf3ecbec146b6a6ba544511a16fa1, 817e1b13d456460eb9915890dd578911
I20251028 09:10:50.579661 14006 raft_consensus.cc:2749] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 12 FOLLOWER]: Leader pre-election lost for term 13. Reason: could not achieve majority
W20251028 09:10:50.667150 13475 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40886: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:50.725450 13475 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40886: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
I20251028 09:10:50.765292 13589 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Bootstrap replayed 3/4 log segments. Stats: ops{read=13869 overwritten=0 applied=13866 ignored=0} inserts{seen=693050 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20251028 09:10:50.813763 13589 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Bootstrap replayed 4/4 log segments. Stats: ops{read=14075 overwritten=0 applied=14073 ignored=0} inserts{seen=703400 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20251028 09:10:50.814347 13589 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Bootstrap complete.
I20251028 09:10:50.820851 13589 ts_tablet_manager.cc:1403] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Time spent bootstrapping tablet: real 3.853s	user 3.329s	sys 0.485s
I20251028 09:10:50.821542 13589 raft_consensus.cc:359] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 12 FOLLOWER]: Replica starting. Triggering 2 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:50.822306 13589 raft_consensus.cc:740] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 12 FOLLOWER]: Becoming Follower/Learner. State: Replica: 817e1b13d456460eb9915890dd578911, State: Initialized, Role: FOLLOWER
I20251028 09:10:50.822484 13589 consensus_queue.cc:260] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 14073, Last appended: 10.14075, Last appended by leader: 14075, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:50.822794 13589 ts_tablet_manager.cc:1434] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Time spent starting tablet: real 0.002s	user 0.001s	sys 0.000s
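
Rough replay throughput implied by the bootstrap stats above for replica 817e1b13 (14075 ops and 703400 inserts replayed in 3.853s of wall-clock time); this is only arithmetic on the logged numbers, not a measurement taken elsewhere.

    // Illustrative arithmetic only, using the figures from the bootstrap lines above.
    #include <cstdio>

    int main() {
      const double ops = 14075, inserts = 703400, wall_secs = 3.853;
      std::printf("~%.0f ops/sec, ~%.0f inserts/sec\n",
                  ops / wall_secs, inserts / wall_secs);  // ~3653 ops/sec, ~182559 inserts/sec
      return 0;
    }
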
W20251028 09:10:50.873193 13606 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41518: Illegal state: replica 817e1b13d456460eb9915890dd578911 is not leader of this config: current role FOLLOWER
W20251028 09:10:50.874818 13473 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40886: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
W20251028 09:10:50.977526 13474 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:40886: Illegal state: replica a10940ed9d79478faf0dac2c7b960184 is not leader of this config: current role FOLLOWER
I20251028 09:10:50.990865 14006 raft_consensus.cc:493] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 12 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251028 09:10:50.990993 14006 raft_consensus.cc:515] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 12 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:50.991153 14006 leader_election.cc:290] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [CANDIDATE]: Term 13 pre-election: Requested pre-vote from peers 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271), 3fedf3ecbec146b6a6ba544511a16fa1 (127.8.22.132:43959)
I20251028 09:10:50.991286 13635 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "a10940ed9d79478faf0dac2c7b960184" candidate_term: 13 candidate_status { last_received { term: 10 index: 14073 } } ignore_live_leader: false dest_uuid: "817e1b13d456460eb9915890dd578911" is_pre_election: true
I20251028 09:10:50.991412 13635 raft_consensus.cc:2410] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 12 FOLLOWER]: Leader pre-election vote request: Denying vote to candidate a10940ed9d79478faf0dac2c7b960184 for term 13 because replica has last-logged OpId of term: 10 index: 14075, which is greater than that of the candidate, which has last-logged OpId of term: 10 index: 14073.
I20251028 09:10:50.991393 13784 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "a10940ed9d79478faf0dac2c7b960184" candidate_term: 13 candidate_status { last_received { term: 10 index: 14073 } } ignore_live_leader: false dest_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" is_pre_election: true
W20251028 09:10:50.991820 13451 leader_election.cc:343] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [CANDIDATE]: Term 13 pre-election: Tablet error from VoteRequest() call to peer 3fedf3ecbec146b6a6ba544511a16fa1 (127.8.22.132:43959): Illegal state: must be running to vote when last-logged opid is not known
I20251028 09:10:50.991879 13451 leader_election.cc:304] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [CANDIDATE]: Term 13 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: a10940ed9d79478faf0dac2c7b960184; no voters: 3fedf3ecbec146b6a6ba544511a16fa1, 817e1b13d456460eb9915890dd578911
I20251028 09:10:50.991946 14006 raft_consensus.cc:2749] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 12 FOLLOWER]: Leader pre-election lost for term 13. Reason: could not achieve majority
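
The "Denying vote ... last-logged OpId ... which is greater than that of the candidate" line above follows the usual Raft up-to-dateness check: a voter grants a vote only if the candidate's last-logged (term, index) is at least as current as its own. A minimal sketch of that comparison, with invented names, using the OpIds from the log:

    // Illustrative only: the last-logged OpId comparison behind the vote denial above.
    #include <iostream>
    #include <tuple>

    struct OpId { long term; long index; };

    bool CandidateLogIsAtLeastAsCurrent(const OpId& candidate, const OpId& voter) {
      // Higher term wins; on a term tie, the higher index wins.
      return std::tie(candidate.term, candidate.index) >=
             std::tie(voter.term, voter.index);
    }

    int main() {
      OpId candidate{10, 14073};  // a10940ed...'s last-logged OpId
      OpId voter{10, 14075};      // 817e1b13...'s last-logged OpId
      std::cout << (CandidateLogIsAtLeastAsCurrent(candidate, voter)
                        ? "grant vote" : "deny vote") << "\n";  // deny vote
      return 0;
    }
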
I20251028 09:10:51.105996 14015 raft_consensus.cc:493] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 12 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251028 09:10:51.106156 14015 raft_consensus.cc:515] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 12 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:51.106478 14015 leader_election.cc:290] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [CANDIDATE]: Term 13 pre-election: Requested pre-vote from peers a10940ed9d79478faf0dac2c7b960184 (127.8.22.130:41397), 3fedf3ecbec146b6a6ba544511a16fa1 (127.8.22.132:43959)
I20251028 09:10:51.112190 13784 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "817e1b13d456460eb9915890dd578911" candidate_term: 13 candidate_status { last_received { term: 10 index: 14075 } } ignore_live_leader: false dest_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" is_pre_election: true
W20251028 09:10:51.112557 13586 leader_election.cc:343] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [CANDIDATE]: Term 13 pre-election: Tablet error from VoteRequest() call to peer 3fedf3ecbec146b6a6ba544511a16fa1 (127.8.22.132:43959): Illegal state: must be running to vote when last-logged opid is not known
I20251028 09:10:51.112680 13516 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "817e1b13d456460eb9915890dd578911" candidate_term: 13 candidate_status { last_received { term: 10 index: 14075 } } ignore_live_leader: false dest_uuid: "a10940ed9d79478faf0dac2c7b960184" is_pre_election: true
I20251028 09:10:51.112821 13516 raft_consensus.cc:2468] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 12 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 817e1b13d456460eb9915890dd578911 in term 12.
I20251028 09:10:51.113034 13584 leader_election.cc:304] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [CANDIDATE]: Term 13 pre-election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 817e1b13d456460eb9915890dd578911, a10940ed9d79478faf0dac2c7b960184; no voters: 3fedf3ecbec146b6a6ba544511a16fa1
I20251028 09:10:51.113193 14015 raft_consensus.cc:2804] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 12 FOLLOWER]: Leader pre-election won for term 13
I20251028 09:10:51.113250 14015 raft_consensus.cc:493] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 12 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20251028 09:10:51.113274 14015 raft_consensus.cc:3060] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 12 FOLLOWER]: Advancing to term 13
I20251028 09:10:51.114571 14015 raft_consensus.cc:515] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 13 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:51.114704 14015 leader_election.cc:290] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [CANDIDATE]: Term 13 election: Requested vote from peers a10940ed9d79478faf0dac2c7b960184 (127.8.22.130:41397), 3fedf3ecbec146b6a6ba544511a16fa1 (127.8.22.132:43959)
I20251028 09:10:51.114825 13784 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "817e1b13d456460eb9915890dd578911" candidate_term: 13 candidate_status { last_received { term: 10 index: 14075 } } ignore_live_leader: false dest_uuid: "3fedf3ecbec146b6a6ba544511a16fa1"
W20251028 09:10:51.115046 13586 leader_election.cc:343] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [CANDIDATE]: Term 13 election: Tablet error from VoteRequest() call to peer 3fedf3ecbec146b6a6ba544511a16fa1 (127.8.22.132:43959): Illegal state: must be running to vote when last-logged opid is not known
I20251028 09:10:51.115113 13516 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "817e1b13d456460eb9915890dd578911" candidate_term: 13 candidate_status { last_received { term: 10 index: 14075 } } ignore_live_leader: false dest_uuid: "a10940ed9d79478faf0dac2c7b960184"
I20251028 09:10:51.115199 13516 raft_consensus.cc:3060] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 12 FOLLOWER]: Advancing to term 13
I20251028 09:10:51.116423 13516 raft_consensus.cc:2468] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 13 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 817e1b13d456460eb9915890dd578911 in term 13.
I20251028 09:10:51.116627 13584 leader_election.cc:304] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [CANDIDATE]: Term 13 election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 817e1b13d456460eb9915890dd578911, a10940ed9d79478faf0dac2c7b960184; no voters: 3fedf3ecbec146b6a6ba544511a16fa1
I20251028 09:10:51.116736 14015 raft_consensus.cc:2804] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 13 FOLLOWER]: Leader election won for term 13
I20251028 09:10:51.116889 14015 raft_consensus.cc:697] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 13 LEADER]: Becoming Leader. State: Replica: 817e1b13d456460eb9915890dd578911, State: Running, Role: LEADER
I20251028 09:10:51.116983 14015 consensus_queue.cc:237] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 14073, Committed index: 14073, Last appended: 10.14075, Last appended by leader: 14075, Current term: 13, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:51.117615  9475 catalog_manager.cc:5649] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 reported cstate change: term changed from 10 to 13. New cstate: current_term: 13 leader_uuid: "817e1b13d456460eb9915890dd578911" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } health_report { overall_health: UNKNOWN } } }
I20251028 09:10:51.190099 13516 raft_consensus.cc:1275] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 13 FOLLOWER]: Refusing update from remote peer 817e1b13d456460eb9915890dd578911: Log matching property violated. Preceding OpId in replica: term: 10 index: 14073. Preceding OpId from leader: term: 13 index: 14077. (index mismatch)
W20251028 09:10:51.190109 13586 consensus_peers.cc:597] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 -> Peer 3fedf3ecbec146b6a6ba544511a16fa1 (127.8.22.132:43959): Couldn't send request to peer 3fedf3ecbec146b6a6ba544511a16fa1. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: BOOTSTRAPPING. This is attempt 1: this message will repeat every 5th retry.
I20251028 09:10:51.190557 14015 consensus_queue.cc:1048] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [LEADER]: Connected to new peer: Peer: permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 14076, Last known committed idx: 14073, Time since last communication: 0.000s
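
The LMP_MISMATCH status above comes from the log-matching check: a follower accepts an update only if the leader's stated preceding entry matches what the follower already has; otherwise the leader backs off its next index for that peer. A small sketch of the check, assuming the simple equality comparison the message describes (names invented, not Kudu's code):

    // Illustrative only: "Preceding OpId in replica: 10.14073. Preceding OpId from leader: 13.14077."
    #include <iostream>

    struct OpId { long term; long index; };

    bool PrecedingEntryMatches(const OpId& in_replica, const OpId& from_leader) {
      return in_replica.term == from_leader.term &&
             in_replica.index == from_leader.index;
    }

    int main() {
      OpId replica{10, 14073};
      OpId leader{13, 14077};
      std::cout << (PrecedingEntryMatches(replica, leader)
                        ? "accept" : "LMP mismatch") << "\n";  // LMP mismatch
      return 0;
    }
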
I20251028 09:10:51.193135 14014 mvcc.cc:204] Tried to move back new op lower bound from 7215688299272175616 to 7215688298975666176. Current Snapshot: MvccSnapshot[applied={T|T < 7215688275560243200}]
I20251028 09:10:51.194587 14026 mvcc.cc:204] Tried to move back new op lower bound from 7215688299272175616 to 7215688298975666176. Current Snapshot: MvccSnapshot[applied={T|T < 7215688275560243200}]
I20251028 09:10:51.574074 13723 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Bootstrap replayed 3/4 log segments. Stats: ops{read=14006 overwritten=0 applied=14006 ignored=0} inserts{seen=700050 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20251028 09:10:51.595907 13723 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Bootstrap replayed 4/4 log segments. Stats: ops{read=14073 overwritten=0 applied=14073 ignored=0} inserts{seen=703400 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20251028 09:10:51.596577 13723 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Bootstrap complete.
I20251028 09:10:51.604171 13723 ts_tablet_manager.cc:1403] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Time spent bootstrapping tablet: real 4.427s	user 3.841s	sys 0.518s
I20251028 09:10:51.604969 13723 raft_consensus.cc:359] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 12 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:51.605358 13723 raft_consensus.cc:740] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 12 FOLLOWER]: Becoming Follower/Learner. State: Replica: 3fedf3ecbec146b6a6ba544511a16fa1, State: Initialized, Role: FOLLOWER
I20251028 09:10:51.605520 13723 consensus_queue.cc:260] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 14073, Last appended: 10.14073, Last appended by leader: 14073, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:51.605867 13723 ts_tablet_manager.cc:1434] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Time spent starting tablet: real 0.002s	user 0.001s	sys 0.000s
I20251028 09:10:51.635520 13784 raft_consensus.cc:3060] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 12 FOLLOWER]: Advancing to term 13
I20251028 09:10:51.637104 13784 raft_consensus.cc:1275] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 13 FOLLOWER]: Refusing update from remote peer 817e1b13d456460eb9915890dd578911: Log matching property violated. Preceding OpId in replica: term: 10 index: 14073. Preceding OpId from leader: term: 10 index: 14075. (index mismatch)
I20251028 09:10:51.637908 14036 consensus_queue.cc:1050] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [LEADER]: Got LMP mismatch error from peer: Peer: permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 14076, Last known committed idx: 14073, Time since last communication: 0.000s
I20251028 09:10:51.710186 14040 mvcc.cc:204] Tried to move back new op lower bound from 7215688301103517696 to 7215688298975666176. Current Snapshot: MvccSnapshot[applied={T|T < 7215688300783955968 or (T in {7215688300794171392,7215688300796956672,7215688300803072000,7215688300806492160,7215688300809269248,7215688300785258496,7215688300816379904})}]
I20251028 09:10:52.859561 13898 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:52.865773 13628 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
I20251028 09:10:52.876521 13765 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:52.899325 13496 tablet_service.cc:1467] Tablet server has 0 leaders and 3 scanners
I20251028 09:10:53.271376  9475 ts_manager.cc:284] Unset tserver state for 817e1b13d456460eb9915890dd578911 from MAINTENANCE_MODE
I20251028 09:10:53.272621 13562 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:53.355052  9473 ts_manager.cc:284] Unset tserver state for a10940ed9d79478faf0dac2c7b960184 from MAINTENANCE_MODE
I20251028 09:10:53.373801  9473 ts_manager.cc:284] Unset tserver state for bdeedf35d61a49bda10c70688541a0ea from MAINTENANCE_MODE
I20251028 09:10:53.389659 13964 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:53.431226  9473 ts_manager.cc:284] Unset tserver state for 3fedf3ecbec146b6a6ba544511a16fa1 from MAINTENANCE_MODE
I20251028 09:10:53.662618 13831 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:53.672781 13697 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:53.755214  9473 ts_manager.cc:295] Set tserver state for a10940ed9d79478faf0dac2c7b960184 to MAINTENANCE_MODE
I20251028 09:10:53.993652  9473 ts_manager.cc:295] Set tserver state for bdeedf35d61a49bda10c70688541a0ea to MAINTENANCE_MODE
I20251028 09:10:54.015100  9473 ts_manager.cc:295] Set tserver state for 817e1b13d456460eb9915890dd578911 to MAINTENANCE_MODE
I20251028 09:10:54.027678  9473 ts_manager.cc:295] Set tserver state for 3fedf3ecbec146b6a6ba544511a16fa1 to MAINTENANCE_MODE
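
The Set/Unset lines above describe per-tserver state transitions keyed by UUID. A minimal in-memory sketch of that bookkeeping, purely illustrative and not Kudu's TSManager:

    #include <iostream>
    #include <string>
    #include <unordered_map>

    enum class TServerState { kNone, kMaintenanceMode };

    class StateMap {
     public:
      void Set(const std::string& uuid, TServerState s) { states_[uuid] = s; }
      void Unset(const std::string& uuid) { states_.erase(uuid); }
      TServerState Get(const std::string& uuid) const {
        auto it = states_.find(uuid);
        return it == states_.end() ? TServerState::kNone : it->second;
      }
     private:
      std::unordered_map<std::string, TServerState> states_;
    };

    int main() {
      StateMap m;
      m.Set("a10940ed9d79478faf0dac2c7b960184", TServerState::kMaintenanceMode);
      m.Unset("a10940ed9d79478faf0dac2c7b960184");
      std::cout << (m.Get("a10940ed9d79478faf0dac2c7b960184") == TServerState::kNone
                        ? "no state" : "maintenance mode") << "\n";  // no state
      return 0;
    }
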
I20251028 09:10:54.031037 13496 tablet_service.cc:1460] Tablet server a10940ed9d79478faf0dac2c7b960184 set to quiescing
I20251028 09:10:54.031116 13496 tablet_service.cc:1467] Tablet server has 0 leaders and 3 scanners
I20251028 09:10:54.276211 13562 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:54.277904 13898 tablet_service.cc:1460] Tablet server bdeedf35d61a49bda10c70688541a0ea set to quiescing
I20251028 09:10:54.277978 13898 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:54.390736 13964 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:54.419899 13628 tablet_service.cc:1460] Tablet server 817e1b13d456460eb9915890dd578911 set to quiescing
I20251028 09:10:54.420096 13628 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
I20251028 09:10:54.422914 14083 raft_consensus.cc:993] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Instructing follower 3fedf3ecbec146b6a6ba544511a16fa1 to start an election
I20251028 09:10:54.423039 14083 raft_consensus.cc:1081] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 13 LEADER]: Signalling peer 3fedf3ecbec146b6a6ba544511a16fa1 to start an election
I20251028 09:10:54.423309 13783 tablet_service.cc:2038] Received Run Leader Election RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" dest_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" from {username='slave'} at 127.8.22.129:40265
I20251028 09:10:54.423424 13783 raft_consensus.cc:493] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 13 FOLLOWER]: Starting forced leader election (received explicit request)
I20251028 09:10:54.423489 13783 raft_consensus.cc:3060] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 13 FOLLOWER]: Advancing to term 14
I20251028 09:10:54.424049 13765 tablet_service.cc:1460] Tablet server 3fedf3ecbec146b6a6ba544511a16fa1 set to quiescing
I20251028 09:10:54.424099 13765 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:54.424408 13783 raft_consensus.cc:515] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 14 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:54.424710 13783 leader_election.cc:290] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [CANDIDATE]: Term 14 election: Requested vote from peers 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271), a10940ed9d79478faf0dac2c7b960184 (127.8.22.130:41397)
I20251028 09:10:54.425793 13784 raft_consensus.cc:1240] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 14 FOLLOWER]: Rejecting Update request from peer 817e1b13d456460eb9915890dd578911 for earlier term 13. Current term is 14. Ops: [13.16519-13.16519]
I20251028 09:10:54.426621 14034 consensus_queue.cc:1059] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [LEADER]: Peer responded invalid term: Peer: permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 }, Status: INVALID_TERM, Last received: 13.16518, Next index: 16519, Last known committed idx: 16518, Time since last communication: 0.000s
I20251028 09:10:54.426750 14034 raft_consensus.cc:3055] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 13 LEADER]: Stepping down as leader of term 13
I20251028 09:10:54.426797 14034 raft_consensus.cc:740] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 13 LEADER]: Becoming Follower/Learner. State: Replica: 817e1b13d456460eb9915890dd578911, State: Running, Role: LEADER
I20251028 09:10:54.426851 14034 consensus_queue.cc:260] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 16518, Committed index: 16518, Last appended: 13.16521, Last appended by leader: 16521, Current term: 13, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:54.426991 14034 raft_consensus.cc:3060] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 13 FOLLOWER]: Advancing to term 14
I20251028 09:10:54.431155 13516 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" candidate_term: 14 candidate_status { last_received { term: 13 index: 16518 } } ignore_live_leader: true dest_uuid: "a10940ed9d79478faf0dac2c7b960184"
I20251028 09:10:54.431264 13516 raft_consensus.cc:3060] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 13 FOLLOWER]: Advancing to term 14
I20251028 09:10:54.432448 13516 raft_consensus.cc:2410] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 14 FOLLOWER]: Leader election vote request: Denying vote to candidate 3fedf3ecbec146b6a6ba544511a16fa1 for term 14 because replica has last-logged OpId of term: 13 index: 16521, which is greater than that of the candidate, which has last-logged OpId of term: 13 index: 16518.
I20251028 09:10:54.432813 13635 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" candidate_term: 14 candidate_status { last_received { term: 13 index: 16518 } } ignore_live_leader: true dest_uuid: "817e1b13d456460eb9915890dd578911"
I20251028 09:10:54.432929 13635 raft_consensus.cc:2410] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 14 FOLLOWER]: Leader election vote request: Denying vote to candidate 3fedf3ecbec146b6a6ba544511a16fa1 for term 14 because replica has last-logged OpId of term: 13 index: 16521, which is greater than that of the candidate, which has last-logged OpId of term: 13 index: 16518.
I20251028 09:10:54.433132 13717 leader_election.cc:304] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [CANDIDATE]: Term 14 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 3fedf3ecbec146b6a6ba544511a16fa1; no voters: 817e1b13d456460eb9915890dd578911, a10940ed9d79478faf0dac2c7b960184
I20251028 09:10:54.433423 14226 raft_consensus.cc:2749] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 14 FOLLOWER]: Leader election lost for term 14. Reason: could not achieve majority
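
Even though the forced term-14 election above lost, it still made the term-13 leader step down (lines from raft_consensus.cc:3055/3060 above): any replica that observes a term greater than its own becomes a follower in that term. A minimal sketch of that term bookkeeping, with invented names:

    #include <iostream>

    struct Replica {
      long current_term = 13;
      bool is_leader = true;
      void ObserveTerm(long term) {
        if (term > current_term) {
          current_term = term;   // "Advancing to term 14"
          is_leader = false;     // "Stepping down as leader of term 13"
        }
      }
    };

    int main() {
      Replica leader;          // 817e1b13... as leader of term 13
      leader.ObserveTerm(14);  // the rejection from 3fedf3ec... carries term 14
      std::cout << "term=" << leader.current_term
                << " leader=" << (leader.is_leader ? "yes" : "no") << "\n";  // term=14 leader=no
      return 0;
    }
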
W20251028 09:10:54.726579 14227 raft_consensus.cc:670] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: failed to trigger leader election: Illegal state: leader elections are disabled
W20251028 09:10:54.789140 14015 raft_consensus.cc:670] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: failed to trigger leader election: Illegal state: leader elections are disabled
W20251028 09:10:54.796648 14226 raft_consensus.cc:670] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: failed to trigger leader election: Illegal state: leader elections are disabled
I20251028 09:10:55.181505 13496 tablet_service.cc:1460] Tablet server a10940ed9d79478faf0dac2c7b960184 set to quiescing
I20251028 09:10:55.181574 13496 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:55.568077 13628 tablet_service.cc:1460] Tablet server 817e1b13d456460eb9915890dd578911 set to quiescing
I20251028 09:10:55.568145 13628 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
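
The repeated "has N leaders and M scanners" polling above suggests the readiness predicate for killing a quiescing tserver: wait until it reports no leaders and no active scanners, as happens right before the kill that follows. A tiny sketch of that predicate, illustrative only:

    #include <iostream>

    bool SafeToRestart(int leaders, int scanners) {
      return leaders == 0 && scanners == 0;
    }

    int main() {
      std::cout << std::boolalpha
                << SafeToRestart(0, 3) << "\n"   // false: scanners still open
                << SafeToRestart(0, 0) << "\n";  // true: proceed with the kill
      return 0;
    }
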
I20251028 09:10:55.624048  8282 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6m1lU8/build/release/bin/kudu with pid 13432
I20251028 09:10:55.636385  8282 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6m1lU8/build/release/bin/kudu
/tmp/dist-test-task6m1lU8/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.8.22.130:41397
--local_ip_for_outbound_sockets=127.8.22.130
--tserver_master_addrs=127.8.22.190:37123
--webserver_port=36451
--webserver_interface=127.8.22.130
--builtin_ntp_servers=127.8.22.148:35497
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
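
A rough election-timeout estimate implied by --raft_heartbeat_interval_ms=100 in the flags above. The multiplier of 3 missed heartbeat periods is an assumption made for this sketch, not a value taken from this log; the real timeout also includes randomized backoff.

    // Illustrative arithmetic only, under the assumed multiplier.
    #include <cstdio>

    int main() {
      const int heartbeat_ms = 100;       // from the flags above
      const double missed_periods = 3.0;  // assumed, not from the log
      std::printf("base failure-detection timeout ~%.0f ms (plus randomized backoff)\n",
                  heartbeat_ms * missed_periods);
      return 0;
    }
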
W20251028 09:10:55.717550 14250 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251028 09:10:55.717736 14250 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251028 09:10:55.717758 14250 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251028 09:10:55.719290 14250 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251028 09:10:55.719353 14250 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.8.22.130
I20251028 09:10:55.720942 14250 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.8.22.148:35497
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.8.22.130:41397
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.8.22.130
--webserver_port=36451
--tserver_master_addrs=127.8.22.190:37123
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.14250
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.8.22.130
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 60f5e5267b92c39485a66121d3ce3cc7ef57b0e0
build type RELEASE
built by None at 28 Oct 2025 08:43:13 UTC on 5fd53c4cbb9d
build id 8755
I20251028 09:10:55.721181 14250 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251028 09:10:55.721416 14250 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251028 09:10:55.724329 14258 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:55.724409 14255 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:55.724411 14256 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:10:55.724431 14250 server_base.cc:1047] running on GCE node
I20251028 09:10:55.724807 14250 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251028 09:10:55.725018 14250 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251028 09:10:55.726152 14250 hybrid_clock.cc:648] HybridClock initialized: now 1761642655726141 us; error 26 us; skew 500 ppm
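
The "error 26 us; skew 500 ppm" figures above bound clock uncertainty over time: roughly, between synchronizations the worst-case error grows by the skew rate (500 ppm is 500 us of drift per second of elapsed time) on top of the initial error. A sketch of that arithmetic, illustrative only:

    #include <cstdio>

    int main() {
      const double initial_error_us = 26.0;
      const double skew_ppm = 500.0;     // ppm of drift == microseconds per second
      const double elapsed_s = 2.0;      // example: two seconds since the last sync
      const double max_error_us = initial_error_us + skew_ppm * elapsed_s;
      std::printf("worst-case error after %.0fs: %.0f us\n", elapsed_s, max_error_us);  // 1026 us
      return 0;
    }
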
I20251028 09:10:55.727317 14250 webserver.cc:492] Webserver started at http://127.8.22.130:36451/ using document root <none> and password file <none>
I20251028 09:10:55.727515 14250 fs_manager.cc:362] Metadata directory not provided
I20251028 09:10:55.727562 14250 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251028 09:10:55.728753 14250 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251028 09:10:55.729420 14264 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:55.729590 14250 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.001s	sys 0.000s
I20251028 09:10:55.729660 14250 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data,/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/wal
uuid: "a10940ed9d79478faf0dac2c7b960184"
format_stamp: "Formatted at 2025-10-28 09:10:15 on dist-test-slave-kqwd"
I20251028 09:10:55.729919 14250 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251028 09:10:55.760367 14250 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251028 09:10:55.760629 14250 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251028 09:10:55.760739 14250 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251028 09:10:55.760964 14250 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251028 09:10:55.761579 14271 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251028 09:10:55.762768 14250 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251028 09:10:55.762836 14250 ts_tablet_manager.cc:531] Time spent loading tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251028 09:10:55.762885 14250 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251028 09:10:55.763702 14250 ts_tablet_manager.cc:616] Registered 1 tablets
I20251028 09:10:55.763813 14250 ts_tablet_manager.cc:595] Time spent registering tablets: real 0.001s	user 0.001s	sys 0.000s
I20251028 09:10:55.763824 14271 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Bootstrap starting.
I20251028 09:10:55.771095 14250 rpc_server.cc:307] RPC server started. Bound to: 127.8.22.130:41397
I20251028 09:10:55.771170 14378 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.8.22.130:41397 every 8 connection(s)
I20251028 09:10:55.771453 14250 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-1/data/info.pb
I20251028 09:10:55.781617  8282 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6m1lU8/build/release/bin/kudu as pid 14250
I20251028 09:10:55.781756  8282 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6m1lU8/build/release/bin/kudu with pid 13567
I20251028 09:10:55.787228 14379 heartbeater.cc:344] Connected to a master server at 127.8.22.190:37123
I20251028 09:10:55.787364 14379 heartbeater.cc:461] Registering TS with master...
I20251028 09:10:55.787598 14379 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:55.788173  9473 ts_manager.cc:194] Re-registered known tserver with Master: a10940ed9d79478faf0dac2c7b960184 (127.8.22.130:41397)
I20251028 09:10:55.788764  9473 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.8.22.130:59141
W20251028 09:10:55.794898 10071 meta_cache.cc:302] tablet cd9b9f67e7d142db81a1c8be59070ef2: replica 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271) has failed: Network error: recv got EOF from 127.8.22.129:39271 (error 108)
I20251028 09:10:55.795679  8282 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6m1lU8/build/release/bin/kudu
/tmp/dist-test-task6m1lU8/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.8.22.129:39271
--local_ip_for_outbound_sockets=127.8.22.129
--tserver_master_addrs=127.8.22.190:37123
--webserver_port=38771
--webserver_interface=127.8.22.129
--builtin_ntp_servers=127.8.22.148:35497
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20251028 09:10:55.806627 13726 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36422: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:55.807538 13726 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36422: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:55.810662 13726 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36422: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
I20251028 09:10:55.821303 14271 log.cc:826] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Log is configured to *not* fsync() on all Append() calls
W20251028 09:10:55.827234 13726 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36422: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:55.830883 13726 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36422: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:55.835001 13726 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36422: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
I20251028 09:10:55.852520 10101 meta_cache.cc:1510] marking tablet server 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271) as failed
W20251028 09:10:55.855582 13726 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36422: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:55.862279 13726 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36422: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:55.863663 13726 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36422: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
I20251028 09:10:55.887144 10100 meta_cache.cc:1510] marking tablet server 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271) as failed
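
The "marking tablet server ... as failed" lines above reflect client-side bookkeeping: once a replica is marked failed, subsequent requests are routed to the remaining replicas. A minimal replica-cache sketch of that idea, illustrative only and not Kudu's meta_cache:

    #include <iostream>
    #include <string>
    #include <unordered_set>
    #include <utility>
    #include <vector>

    class ReplicaCache {
     public:
      explicit ReplicaCache(std::vector<std::string> uuids) : replicas_(std::move(uuids)) {}
      void MarkFailed(const std::string& uuid) { failed_.insert(uuid); }
      // Returns the first replica not currently marked failed, or "" if none remain.
      std::string PickReplica() const {
        for (const auto& r : replicas_) {
          if (!failed_.count(r)) return r;
        }
        return "";
      }
     private:
      std::vector<std::string> replicas_;
      std::unordered_set<std::string> failed_;
    };

    int main() {
      ReplicaCache cache({"817e1b13d456460eb9915890dd578911",
                          "a10940ed9d79478faf0dac2c7b960184",
                          "3fedf3ecbec146b6a6ba544511a16fa1"});
      cache.MarkFailed("817e1b13d456460eb9915890dd578911");  // as in the log lines above
      std::cout << "next replica: " << cache.PickReplica() << "\n";
      return 0;
    }
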
W20251028 09:10:55.906735 13726 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36422: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:55.908437 14383 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251028 09:10:55.908650 14383 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251028 09:10:55.908721 14383 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251028 09:10:55.911063 14383 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251028 09:10:55.911142 14383 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.8.22.129
I20251028 09:10:55.913563 14383 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.8.22.148:35497
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.8.22.129:39271
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.8.22.129
--webserver_port=38771
--tserver_master_addrs=127.8.22.190:37123
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.14383
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.8.22.129
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 60f5e5267b92c39485a66121d3ce3cc7ef57b0e0
build type RELEASE
built by None at 28 Oct 2025 08:43:13 UTC on 5fd53c4cbb9d
build id 8755
I20251028 09:10:55.913812 14383 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251028 09:10:55.914100 14383 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251028 09:10:55.915043 13726 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36422: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:55.915047 13727 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36422: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:55.917053 14392 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:10:55.917254 14383 server_base.cc:1047] running on GCE node
W20251028 09:10:55.917034 14394 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:55.917048 14391 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:10:55.917534 14383 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251028 09:10:55.917727 14383 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251028 09:10:55.918881 14383 hybrid_clock.cc:648] HybridClock initialized: now 1761642655918866 us; error 28 us; skew 500 ppm
I20251028 09:10:55.920029 14383 webserver.cc:492] Webserver started at http://127.8.22.129:38771/ using document root <none> and password file <none>
I20251028 09:10:55.920222 14383 fs_manager.cc:362] Metadata directory not provided
I20251028 09:10:55.920264 14383 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251028 09:10:55.921499 14383 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251028 09:10:55.922431 14400 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:55.922636 14383 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20251028 09:10:55.922725 14383 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data,/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/wal
uuid: "817e1b13d456460eb9915890dd578911"
format_stamp: "Formatted at 2025-10-28 09:10:15 on dist-test-slave-kqwd"
I20251028 09:10:55.923043 14383 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251028 09:10:55.930160 10102 meta_cache.cc:1510] marking tablet server 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271) as failed
I20251028 09:10:55.931180 14383 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251028 09:10:55.931412 14383 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251028 09:10:55.931524 14383 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251028 09:10:55.931726 14383 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251028 09:10:55.932176 14407 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251028 09:10:55.933466 14383 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251028 09:10:55.933513 14383 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251028 09:10:55.933549 14383 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251028 09:10:55.934125 14383 ts_tablet_manager.cc:616] Registered 1 tablets
I20251028 09:10:55.934160 14383 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20251028 09:10:55.934235 14407 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Bootstrap starting.
I20251028 09:10:55.941347 14383 rpc_server.cc:307] RPC server started. Bound to: 127.8.22.129:39271
I20251028 09:10:55.941397 14514 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.8.22.129:39271 every 8 connection(s)
I20251028 09:10:55.941767 14383 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-0/data/info.pb
I20251028 09:10:55.946344 14515 heartbeater.cc:344] Connected to a master server at 127.8.22.190:37123
I20251028 09:10:55.946449 14515 heartbeater.cc:461] Registering TS with master...
I20251028 09:10:55.946628 14515 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:55.947134  9475 ts_manager.cc:194] Re-registered known tserver with Master: 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271)
I20251028 09:10:55.947525  9475 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.8.22.129:58575
I20251028 09:10:55.951709  8282 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6m1lU8/build/release/bin/kudu as pid 14383
I20251028 09:10:55.951817  8282 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6m1lU8/build/release/bin/kudu with pid 13702
I20251028 09:10:55.965734  8282 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6m1lU8/build/release/bin/kudu
/tmp/dist-test-task6m1lU8/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.8.22.132:43959
--local_ip_for_outbound_sockets=127.8.22.132
--tserver_master_addrs=127.8.22.190:37123
--webserver_port=36911
--webserver_interface=127.8.22.132
--builtin_ntp_servers=127.8.22.148:35497
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20251028 09:10:56.019613 14407 log.cc:826] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Log is configured to *not* fsync() on all Append() calls
W20251028 09:10:56.050194 14519 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251028 09:10:56.050370 14519 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251028 09:10:56.050406 14519 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251028 09:10:56.052093 14519 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251028 09:10:56.052158 14519 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.8.22.132
I20251028 09:10:56.053686 14519 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.8.22.148:35497
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.8.22.132:43959
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.8.22.132
--webserver_port=36911
--tserver_master_addrs=127.8.22.190:37123
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.14519
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.8.22.132
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 60f5e5267b92c39485a66121d3ce3cc7ef57b0e0
build type RELEASE
built by None at 28 Oct 2025 08:43:13 UTC on 5fd53c4cbb9d
build id 8755
I20251028 09:10:56.053917 14519 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251028 09:10:56.054185 14519 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251028 09:10:56.056715 14525 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:56.056917 14526 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:56.056716 14528 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:10:56.057150 14519 server_base.cc:1047] running on GCE node
I20251028 09:10:56.057300 14519 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251028 09:10:56.057492 14519 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251028 09:10:56.060863 14519 hybrid_clock.cc:648] HybridClock initialized: now 1761642656060837 us; error 29 us; skew 500 ppm
I20251028 09:10:56.062179 14519 webserver.cc:492] Webserver started at http://127.8.22.132:36911/ using document root <none> and password file <none>
I20251028 09:10:56.062389 14519 fs_manager.cc:362] Metadata directory not provided
I20251028 09:10:56.062435 14519 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251028 09:10:56.063771 14519 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.000s	sys 0.001s
I20251028 09:10:56.064468 14534 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:56.064649 14519 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.000s	sys 0.001s
I20251028 09:10:56.064718 14519 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data,/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/wal
uuid: "3fedf3ecbec146b6a6ba544511a16fa1"
format_stamp: "Formatted at 2025-10-28 09:10:15 on dist-test-slave-kqwd"
I20251028 09:10:56.065641 14519 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251028 09:10:56.076207 14519 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251028 09:10:56.076453 14519 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251028 09:10:56.076568 14519 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251028 09:10:56.076761 14519 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251028 09:10:56.078614 14541 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251028 09:10:56.079542 14519 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251028 09:10:56.079591 14519 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.003s	user 0.000s	sys 0.000s
I20251028 09:10:56.079627 14519 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251028 09:10:56.080147 14519 ts_tablet_manager.cc:616] Registered 1 tablets
I20251028 09:10:56.080190 14519 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20251028 09:10:56.080322 14541 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Bootstrap starting.
I20251028 09:10:56.087859 14519 rpc_server.cc:307] RPC server started. Bound to: 127.8.22.132:43959
I20251028 09:10:56.088532 14519 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-3/data/info.pb
I20251028 09:10:56.091483  8282 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6m1lU8/build/release/bin/kudu as pid 14519
I20251028 09:10:56.091596  8282 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6m1lU8/build/release/bin/kudu with pid 13835
I20251028 09:10:56.098822  8282 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6m1lU8/build/release/bin/kudu
/tmp/dist-test-task6m1lU8/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.8.22.131:39559
--local_ip_for_outbound_sockets=127.8.22.131
--tserver_master_addrs=127.8.22.190:37123
--webserver_port=36027
--webserver_interface=127.8.22.131
--builtin_ntp_servers=127.8.22.148:35497
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20251028 09:10:56.099557 14648 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.8.22.132:43959 every 8 connection(s)
I20251028 09:10:56.110167 14649 heartbeater.cc:344] Connected to a master server at 127.8.22.190:37123
I20251028 09:10:56.110280 14649 heartbeater.cc:461] Registering TS with master...
I20251028 09:10:56.110520 14649 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:56.111275  9475 ts_manager.cc:194] Re-registered known tserver with Master: 3fedf3ecbec146b6a6ba544511a16fa1 (127.8.22.132:43959)
I20251028 09:10:56.111796  9475 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.8.22.132:42259
I20251028 09:10:56.166880 14541 log.cc:826] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Log is configured to *not* fsync() on all Append() calls
W20251028 09:10:56.235018 14651 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251028 09:10:56.235334 14651 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251028 09:10:56.235422 14651 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251028 09:10:56.237932 14651 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251028 09:10:56.238168 14651 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.8.22.131
I20251028 09:10:56.240878 14651 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.8.22.148:35497
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.8.22.131:39559
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.8.22.131
--webserver_port=36027
--tserver_master_addrs=127.8.22.190:37123
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.14651
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.8.22.131
--log_dir=/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 60f5e5267b92c39485a66121d3ce3cc7ef57b0e0
build type RELEASE
built by None at 28 Oct 2025 08:43:13 UTC on 5fd53c4cbb9d
build id 8755
I20251028 09:10:56.241463 14651 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251028 09:10:56.241856 14651 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251028 09:10:56.245198 14660 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251028 09:10:56.245944 14659 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:10:56.246485 14651 server_base.cc:1047] running on GCE node
W20251028 09:10:56.246692 14662 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251028 09:10:56.246973 14651 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251028 09:10:56.247202 14651 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251028 09:10:56.250479 14651 hybrid_clock.cc:648] HybridClock initialized: now 1761642656250019 us; error 474 us; skew 500 ppm
I20251028 09:10:56.252139 14651 webserver.cc:492] Webserver started at http://127.8.22.131:36027/ using document root <none> and password file <none>
I20251028 09:10:56.252384 14651 fs_manager.cc:362] Metadata directory not provided
I20251028 09:10:56.252432 14651 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251028 09:10:56.254197 14651 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.000s	sys 0.002s
I20251028 09:10:56.255306 14668 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:56.255702 14651 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.000s	sys 0.000s
I20251028 09:10:56.255780 14651 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data,/tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/wal
uuid: "bdeedf35d61a49bda10c70688541a0ea"
format_stamp: "Formatted at 2025-10-28 09:10:15 on dist-test-slave-kqwd"
I20251028 09:10:56.256146 14651 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251028 09:10:56.295565 14651 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251028 09:10:56.295936 14651 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251028 09:10:56.296128 14651 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251028 09:10:56.296384 14651 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251028 09:10:56.296810 14651 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20251028 09:10:56.296914 14651 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:56.296994 14651 ts_tablet_manager.cc:616] Registered 0 tablets
I20251028 09:10:56.297068 14651 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20251028 09:10:56.304250 14651 rpc_server.cc:307] RPC server started. Bound to: 127.8.22.131:39559
I20251028 09:10:56.304752 14651 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0/minicluster-data/ts-2/data/info.pb
I20251028 09:10:56.308116 14781 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.8.22.131:39559 every 8 connection(s)
I20251028 09:10:56.310523  8282 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6m1lU8/build/release/bin/kudu as pid 14651
I20251028 09:10:56.320353 14782 heartbeater.cc:344] Connected to a master server at 127.8.22.190:37123
I20251028 09:10:56.320477 14782 heartbeater.cc:461] Registering TS with master...
I20251028 09:10:56.320752 14782 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:10:56.322257  9473 ts_manager.cc:194] Re-registered known tserver with Master: bdeedf35d61a49bda10c70688541a0ea (127.8.22.131:39559)
I20251028 09:10:56.322793  9473 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.8.22.131:59525
I20251028 09:10:56.495000 14564 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:56.507959 14430 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:56.513674 14716 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:56.531690 14313 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:10:56.789709 14379 heartbeater.cc:499] Master 127.8.22.190:37123 was elected leader, sending a full tablet report...
I20251028 09:10:56.948259 14515 heartbeater.cc:499] Master 127.8.22.190:37123 was elected leader, sending a full tablet report...
I20251028 09:10:57.112465 14271 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Bootstrap replayed 1/4 log segments. Stats: ops{read=4623 overwritten=0 applied=4620 ignored=0} inserts{seen=230850 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20251028 09:10:57.112689 14649 heartbeater.cc:499] Master 127.8.22.190:37123 was elected leader, sending a full tablet report...
I20251028 09:10:57.134992 14541 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Bootstrap replayed 1/4 log segments. Stats: ops{read=4623 overwritten=0 applied=4621 ignored=0} inserts{seen=230900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20251028 09:10:57.323985 14782 heartbeater.cc:499] Master 127.8.22.190:37123 was elected leader, sending a full tablet report...
I20251028 09:10:57.331832 14407 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Bootstrap replayed 1/4 log segments. Stats: ops{read=4623 overwritten=0 applied=4621 ignored=0} inserts{seen=230900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20251028 09:10:58.003283 14541 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Bootstrap replayed 2/4 log segments. Stats: ops{read=9246 overwritten=0 applied=9243 ignored=0} inserts{seen=461950 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20251028 09:10:58.477097 14271 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Bootstrap replayed 2/4 log segments. Stats: ops{read=9245 overwritten=0 applied=9243 ignored=0} inserts{seen=461950 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20251028 09:10:58.697899 14407 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Bootstrap replayed 2/4 log segments. Stats: ops{read=9245 overwritten=0 applied=9243 ignored=0} inserts{seen=461950 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20251028 09:10:58.918605 14541 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Bootstrap replayed 3/4 log segments. Stats: ops{read=13869 overwritten=0 applied=13866 ignored=0} inserts{seen=693050 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20251028 09:10:59.457383 14541 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Bootstrap replayed 4/4 log segments. Stats: ops{read=16518 overwritten=0 applied=16518 ignored=0} inserts{seen=825600 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20251028 09:10:59.457939 14541 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Bootstrap complete.
I20251028 09:10:59.464464 14541 ts_tablet_manager.cc:1403] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Time spent bootstrapping tablet: real 3.384s	user 2.847s	sys 0.522s
I20251028 09:10:59.465631 14541 raft_consensus.cc:359] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 14 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:59.465850 14541 raft_consensus.cc:740] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 14 FOLLOWER]: Becoming Follower/Learner. State: Replica: 3fedf3ecbec146b6a6ba544511a16fa1, State: Initialized, Role: FOLLOWER
I20251028 09:10:59.465957 14541 consensus_queue.cc:260] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 16518, Last appended: 13.16518, Last appended by leader: 16518, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:59.466230 14541 ts_tablet_manager.cc:1434] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1: Time spent starting tablet: real 0.002s	user 0.003s	sys 0.000s
W20251028 09:10:59.665330 14561 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45786: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:59.711946 14561 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45786: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:59.749174 14561 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45786: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
I20251028 09:10:59.768165 14271 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Bootstrap replayed 3/4 log segments. Stats: ops{read=13869 overwritten=0 applied=13868 ignored=0} inserts{seen=693150 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20251028 09:10:59.822369 14824 raft_consensus.cc:493] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 14 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251028 09:10:59.822535 14824 raft_consensus.cc:515] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 14 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:10:59.822942 14824 leader_election.cc:290] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [CANDIDATE]: Term 15 pre-election: Requested pre-vote from peers 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271), a10940ed9d79478faf0dac2c7b960184 (127.8.22.130:41397)
I20251028 09:10:59.827527 14469 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" candidate_term: 15 candidate_status { last_received { term: 13 index: 16518 } } ignore_live_leader: false dest_uuid: "817e1b13d456460eb9915890dd578911" is_pre_election: true
I20251028 09:10:59.827747 14333 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" candidate_term: 15 candidate_status { last_received { term: 13 index: 16518 } } ignore_live_leader: false dest_uuid: "a10940ed9d79478faf0dac2c7b960184" is_pre_election: true
W20251028 09:10:59.828676 14535 leader_election.cc:343] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [CANDIDATE]: Term 15 pre-election: Tablet error from VoteRequest() call to peer 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271): Illegal state: must be running to vote when last-logged opid is not known
W20251028 09:10:59.828820 14536 leader_election.cc:343] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [CANDIDATE]: Term 15 pre-election: Tablet error from VoteRequest() call to peer a10940ed9d79478faf0dac2c7b960184 (127.8.22.130:41397): Illegal state: must be running to vote when last-logged opid is not known
I20251028 09:10:59.828905 14536 leader_election.cc:304] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [CANDIDATE]: Term 15 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 3fedf3ecbec146b6a6ba544511a16fa1; no voters: 817e1b13d456460eb9915890dd578911, a10940ed9d79478faf0dac2c7b960184
I20251028 09:10:59.829012 14824 raft_consensus.cc:2749] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 14 FOLLOWER]: Leader pre-election lost for term 15. Reason: could not achieve majority
I20251028 09:10:59.908480 14407 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Bootstrap replayed 3/4 log segments. Stats: ops{read=13866 overwritten=0 applied=13865 ignored=0} inserts{seen=693000 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
W20251028 09:10:59.932833 14561 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45786: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:10:59.981098 14561 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45786: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:11:00.021432 14561 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45786: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
I20251028 09:11:00.183064 14824 raft_consensus.cc:493] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 14 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251028 09:11:00.183184 14824 raft_consensus.cc:515] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 14 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:11:00.183358 14824 leader_election.cc:290] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [CANDIDATE]: Term 15 pre-election: Requested pre-vote from peers 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271), a10940ed9d79478faf0dac2c7b960184 (127.8.22.130:41397)
I20251028 09:11:00.183573 14469 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" candidate_term: 15 candidate_status { last_received { term: 13 index: 16518 } } ignore_live_leader: false dest_uuid: "817e1b13d456460eb9915890dd578911" is_pre_election: true
I20251028 09:11:00.183691 14333 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" candidate_term: 15 candidate_status { last_received { term: 13 index: 16518 } } ignore_live_leader: false dest_uuid: "a10940ed9d79478faf0dac2c7b960184" is_pre_election: true
W20251028 09:11:00.183869 14535 leader_election.cc:343] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [CANDIDATE]: Term 15 pre-election: Tablet error from VoteRequest() call to peer 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271): Illegal state: must be running to vote when last-logged opid is not known
W20251028 09:11:00.183926 14536 leader_election.cc:343] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [CANDIDATE]: Term 15 pre-election: Tablet error from VoteRequest() call to peer a10940ed9d79478faf0dac2c7b960184 (127.8.22.130:41397): Illegal state: must be running to vote when last-logged opid is not known
I20251028 09:11:00.183987 14536 leader_election.cc:304] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [CANDIDATE]: Term 15 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 3fedf3ecbec146b6a6ba544511a16fa1; no voters: 817e1b13d456460eb9915890dd578911, a10940ed9d79478faf0dac2c7b960184
I20251028 09:11:00.184085 14824 raft_consensus.cc:2749] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 14 FOLLOWER]: Leader pre-election lost for term 15. Reason: could not achieve majority
W20251028 09:11:00.207190 14561 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45786: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:11:00.260438 14561 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45786: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
W20251028 09:11:00.298525 14561 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45786: Illegal state: replica 3fedf3ecbec146b6a6ba544511a16fa1 is not leader of this config: current role FOLLOWER
I20251028 09:11:00.310725 14271 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Bootstrap replayed 4/4 log segments. Stats: ops{read=16521 overwritten=0 applied=16518 ignored=0} inserts{seen=825600 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20251028 09:11:00.311285 14271 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Bootstrap complete.
I20251028 09:11:00.317433 14271 ts_tablet_manager.cc:1403] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Time spent bootstrapping tablet: real 4.554s	user 3.995s	sys 0.527s
I20251028 09:11:00.318284 14271 raft_consensus.cc:359] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 14 FOLLOWER]: Replica starting. Triggering 3 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:11:00.318928 14271 raft_consensus.cc:740] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 14 FOLLOWER]: Becoming Follower/Learner. State: Replica: a10940ed9d79478faf0dac2c7b960184, State: Initialized, Role: FOLLOWER
I20251028 09:11:00.319129 14271 consensus_queue.cc:260] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 16518, Last appended: 13.16521, Last appended by leader: 16521, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:11:00.319372 14271 ts_tablet_manager.cc:1434] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184: Time spent starting tablet: real 0.002s	user 0.003s	sys 0.001s
I20251028 09:11:00.439298 14407 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Bootstrap replayed 4/4 log segments. Stats: ops{read=16521 overwritten=0 applied=16518 ignored=0} inserts{seen=825600 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20251028 09:11:00.439818 14407 tablet_bootstrap.cc:492] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Bootstrap complete.
I20251028 09:11:00.445885 14407 ts_tablet_manager.cc:1403] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Time spent bootstrapping tablet: real 4.512s	user 3.918s	sys 0.562s
I20251028 09:11:00.446817 14407 raft_consensus.cc:359] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 14 FOLLOWER]: Replica starting. Triggering 3 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:11:00.447520 14407 raft_consensus.cc:740] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 14 FOLLOWER]: Becoming Follower/Learner. State: Replica: 817e1b13d456460eb9915890dd578911, State: Initialized, Role: FOLLOWER
I20251028 09:11:00.447638 14407 consensus_queue.cc:260] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 16518, Last appended: 13.16521, Last appended by leader: 16521, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:11:00.447875 14407 ts_tablet_manager.cc:1434] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911: Time spent starting tablet: real 0.002s	user 0.005s	sys 0.000s
I20251028 09:11:00.568039 14832 raft_consensus.cc:493] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 14 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251028 09:11:00.568187 14832 raft_consensus.cc:515] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 14 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:11:00.568514 14832 leader_election.cc:290] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [CANDIDATE]: Term 15 pre-election: Requested pre-vote from peers 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271), 3fedf3ecbec146b6a6ba544511a16fa1 (127.8.22.132:43959)
I20251028 09:11:00.571624 14603 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "a10940ed9d79478faf0dac2c7b960184" candidate_term: 15 candidate_status { last_received { term: 13 index: 16521 } } ignore_live_leader: false dest_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" is_pre_election: true
I20251028 09:11:00.571677 14469 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "a10940ed9d79478faf0dac2c7b960184" candidate_term: 15 candidate_status { last_received { term: 13 index: 16521 } } ignore_live_leader: false dest_uuid: "817e1b13d456460eb9915890dd578911" is_pre_election: true
I20251028 09:11:00.571764 14603 raft_consensus.cc:2468] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 14 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate a10940ed9d79478faf0dac2c7b960184 in term 14.
I20251028 09:11:00.571818 14469 raft_consensus.cc:2468] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 14 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate a10940ed9d79478faf0dac2c7b960184 in term 14.
I20251028 09:11:00.571970 14268 leader_election.cc:304] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [CANDIDATE]: Term 15 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 3fedf3ecbec146b6a6ba544511a16fa1, a10940ed9d79478faf0dac2c7b960184; no voters: 
I20251028 09:11:00.572118 14832 raft_consensus.cc:2804] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 14 FOLLOWER]: Leader pre-election won for term 15
I20251028 09:11:00.572196 14832 raft_consensus.cc:493] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 14 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20251028 09:11:00.572228 14832 raft_consensus.cc:3060] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 14 FOLLOWER]: Advancing to term 15
I20251028 09:11:00.573189 14832 raft_consensus.cc:515] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 15 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:11:00.573318 14832 leader_election.cc:290] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [CANDIDATE]: Term 15 election: Requested vote from peers 817e1b13d456460eb9915890dd578911 (127.8.22.129:39271), 3fedf3ecbec146b6a6ba544511a16fa1 (127.8.22.132:43959)
I20251028 09:11:00.573518 14603 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "a10940ed9d79478faf0dac2c7b960184" candidate_term: 15 candidate_status { last_received { term: 13 index: 16521 } } ignore_live_leader: false dest_uuid: "3fedf3ecbec146b6a6ba544511a16fa1"
I20251028 09:11:00.573520 14469 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "cd9b9f67e7d142db81a1c8be59070ef2" candidate_uuid: "a10940ed9d79478faf0dac2c7b960184" candidate_term: 15 candidate_status { last_received { term: 13 index: 16521 } } ignore_live_leader: false dest_uuid: "817e1b13d456460eb9915890dd578911"
I20251028 09:11:00.573598 14603 raft_consensus.cc:3060] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 14 FOLLOWER]: Advancing to term 15
I20251028 09:11:00.573598 14469 raft_consensus.cc:3060] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 14 FOLLOWER]: Advancing to term 15
I20251028 09:11:00.574661 14469 raft_consensus.cc:2468] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 15 FOLLOWER]: Leader election vote request: Granting yes vote for candidate a10940ed9d79478faf0dac2c7b960184 in term 15.
I20251028 09:11:00.574661 14603 raft_consensus.cc:2468] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 15 FOLLOWER]: Leader election vote request: Granting yes vote for candidate a10940ed9d79478faf0dac2c7b960184 in term 15.
I20251028 09:11:00.574865 14265 leader_election.cc:304] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [CANDIDATE]: Term 15 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 817e1b13d456460eb9915890dd578911, a10940ed9d79478faf0dac2c7b960184; no voters: 
I20251028 09:11:00.574985 14832 raft_consensus.cc:2804] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 15 FOLLOWER]: Leader election won for term 15
I20251028 09:11:00.575151 14832 raft_consensus.cc:697] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [term 15 LEADER]: Becoming Leader. State: Replica: a10940ed9d79478faf0dac2c7b960184, State: Running, Role: LEADER
I20251028 09:11:00.575244 14832 consensus_queue.cc:237] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 16518, Committed index: 16518, Last appended: 13.16521, Last appended by leader: 16521, Current term: 15, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } }
I20251028 09:11:00.575938  9475 catalog_manager.cc:5649] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 reported cstate change: term changed from 13 to 15, leader changed from 817e1b13d456460eb9915890dd578911 (127.8.22.129) to a10940ed9d79478faf0dac2c7b960184 (127.8.22.130). New cstate: current_term: 15 leader_uuid: "a10940ed9d79478faf0dac2c7b960184" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "a10940ed9d79478faf0dac2c7b960184" member_type: VOTER last_known_addr { host: "127.8.22.130" port: 41397 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 } health_report { overall_health: UNKNOWN } } }
I20251028 09:11:00.651822 14603 raft_consensus.cc:1275] T cd9b9f67e7d142db81a1c8be59070ef2 P 3fedf3ecbec146b6a6ba544511a16fa1 [term 15 FOLLOWER]: Refusing update from remote peer a10940ed9d79478faf0dac2c7b960184: Log matching property violated. Preceding OpId in replica: term: 13 index: 16518. Preceding OpId from leader: term: 15 index: 16522. (index mismatch)
I20251028 09:11:00.652135 14832 consensus_queue.cc:1048] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [LEADER]: Connected to new peer: Peer: permanent_uuid: "3fedf3ecbec146b6a6ba544511a16fa1" member_type: VOTER last_known_addr { host: "127.8.22.132" port: 43959 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 16522, Last known committed idx: 16518, Time since last communication: 0.000s
I20251028 09:11:00.654525 14469 raft_consensus.cc:1275] T cd9b9f67e7d142db81a1c8be59070ef2 P 817e1b13d456460eb9915890dd578911 [term 15 FOLLOWER]: Refusing update from remote peer a10940ed9d79478faf0dac2c7b960184: Log matching property violated. Preceding OpId in replica: term: 13 index: 16521. Preceding OpId from leader: term: 15 index: 16522. (index mismatch)
I20251028 09:11:00.655113 14832 consensus_queue.cc:1048] T cd9b9f67e7d142db81a1c8be59070ef2 P a10940ed9d79478faf0dac2c7b960184 [LEADER]: Connected to new peer: Peer: permanent_uuid: "817e1b13d456460eb9915890dd578911" member_type: VOTER last_known_addr { host: "127.8.22.129" port: 39271 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 16522, Last known committed idx: 16518, Time since last communication: 0.000s
W20251028 09:11:00.657135 10101 scanner-internal.cc:458] Time spent opening tablet: real 6.010s	user 0.002s	sys 0.000s
W20251028 09:11:00.690405 10100 scanner-internal.cc:458] Time spent opening tablet: real 6.008s	user 0.001s	sys 0.001s
W20251028 09:11:00.735846 10102 scanner-internal.cc:458] Time spent opening tablet: real 6.011s	user 0.001s	sys 0.001s
I20251028 09:11:01.814623 14564 tablet_service.cc:1467] Tablet server has 0 leaders and 3 scanners
I20251028 09:11:01.827710 14313 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
I20251028 09:11:01.840025 14430 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:11:01.846390 14716 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251028 09:11:02.263072  9475 ts_manager.cc:284] Unset tserver state for 817e1b13d456460eb9915890dd578911 from MAINTENANCE_MODE
I20251028 09:11:02.265455  9473 ts_manager.cc:284] Unset tserver state for a10940ed9d79478faf0dac2c7b960184 from MAINTENANCE_MODE
I20251028 09:11:02.288352  9473 ts_manager.cc:284] Unset tserver state for bdeedf35d61a49bda10c70688541a0ea from MAINTENANCE_MODE
I20251028 09:11:02.327795 14782 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:11:02.330613  9473 ts_manager.cc:284] Unset tserver state for 3fedf3ecbec146b6a6ba544511a16fa1 from MAINTENANCE_MODE
/home/jenkins-slave/workspace/build_and_test_flaky/src/kudu/integration-tests/maintenance_mode-itest.cc:751: Failure
Value of: s.ok()
  Actual: true
Expected: false
/home/jenkins-slave/workspace/build_and_test_flaky/src/kudu/util/test_util.cc:403: Failure
Failed
Timed out waiting for assertion to pass.
I20251028 09:11:02.658494 14649 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:11:02.658938 14515 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:11:02.659029 14379 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:11:03.328845 14782 heartbeater.cc:507] Master 127.8.22.190:37123 requested a full tablet report, sending...
I20251028 09:11:03.927109  8282 external_mini_cluster-itest-base.cc:80] Found fatal failure
I20251028 09:11:03.927222  8282 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 0 with UUID 817e1b13d456460eb9915890dd578911 and pid 14383
************************ BEGIN STACKS **************************
[New LWP 14387]
[New LWP 14388]
[New LWP 14389]
[New LWP 14390]
[New LWP 14396]
[New LWP 14397]
[New LWP 14398]
[New LWP 14401]
[New LWP 14402]
[New LWP 14403]
[New LWP 14404]
[New LWP 14405]
[New LWP 14406]
[New LWP 14408]
[New LWP 14409]
[New LWP 14410]
[New LWP 14411]
[New LWP 14412]
[New LWP 14413]
[New LWP 14414]
[New LWP 14415]
[New LWP 14416]
[New LWP 14417]
[New LWP 14418]
[New LWP 14419]
[New LWP 14420]
[New LWP 14421]
[New LWP 14422]
[New LWP 14423]
[New LWP 14424]
[New LWP 14425]
[New LWP 14426]
[New LWP 14427]
[New LWP 14428]
[New LWP 14429]
[New LWP 14430]
[New LWP 14431]
[New LWP 14432]
[New LWP 14433]
[New LWP 14434]
[New LWP 14435]
[New LWP 14436]
[New LWP 14437]
[New LWP 14438]
[New LWP 14439]
[New LWP 14440]
[New LWP 14441]
[New LWP 14442]
[New LWP 14443]
[New LWP 14444]
[New LWP 14445]
[New LWP 14446]
[New LWP 14447]
[New LWP 14448]
[New LWP 14449]
[New LWP 14450]
[New LWP 14451]
[New LWP 14452]
[New LWP 14453]
[New LWP 14454]
[New LWP 14455]
[New LWP 14456]
[New LWP 14457]
[New LWP 14458]
[New LWP 14459]
[New LWP 14460]
[New LWP 14461]
[New LWP 14462]
[New LWP 14463]
[New LWP 14464]
[New LWP 14465]
[New LWP 14466]
[New LWP 14467]
[New LWP 14468]
[New LWP 14469]
[New LWP 14470]
[New LWP 14471]
[New LWP 14472]
[New LWP 14473]
[New LWP 14474]
[New LWP 14475]
[New LWP 14476]
[New LWP 14477]
[New LWP 14478]
[New LWP 14479]
[New LWP 14480]
[New LWP 14481]
[New LWP 14482]
[New LWP 14483]
[New LWP 14484]
[New LWP 14485]
[New LWP 14486]
[New LWP 14487]
[New LWP 14488]
[New LWP 14489]
[New LWP 14490]
[New LWP 14491]
[New LWP 14492]
[New LWP 14493]
[New LWP 14494]
[New LWP 14495]
[New LWP 14496]
[New LWP 14497]
[New LWP 14498]
[New LWP 14499]
[New LWP 14500]
[New LWP 14501]
[New LWP 14502]
[New LWP 14503]
[New LWP 14504]
[New LWP 14505]
[New LWP 14506]
[New LWP 14507]
[New LWP 14508]
[New LWP 14509]
[New LWP 14510]
[New LWP 14511]
[New LWP 14512]
[New LWP 14513]
[New LWP 14514]
[New LWP 14515]
[New LWP 14516]
0x00007f9f359acd50 in ?? ()
  Id   Target Id         Frame 
* 1    LWP 14383 "kudu"  0x00007f9f359acd50 in ?? ()
  2    LWP 14387 "kudu"  0x00007f9f359a8fb9 in ?? ()
  3    LWP 14388 "kudu"  0x00007f9f359a8fb9 in ?? ()
  4    LWP 14389 "kudu"  0x00007f9f359a8fb9 in ?? ()
  5    LWP 14390 "kernel-watcher-" 0x00007f9f359a8fb9 in ?? ()
  6    LWP 14396 "ntp client-1439" 0x00007f9f359ac9e2 in ?? ()
  7    LWP 14397 "file cache-evic" 0x00007f9f359a8fb9 in ?? ()
  8    LWP 14398 "sq_acceptor" 0x00007f9f33abdcb9 in ?? ()
  9    LWP 14401 "rpc reactor-144" 0x00007f9f33acaa47 in ?? ()
  10   LWP 14402 "rpc reactor-144" 0x00007f9f33acaa47 in ?? ()
  11   LWP 14403 "rpc reactor-144" 0x00007f9f33acaa47 in ?? ()
  12   LWP 14404 "rpc reactor-144" 0x00007f9f33acaa47 in ?? ()
  13   LWP 14405 "MaintenanceMgr " 0x00007f9f359a8ad3 in ?? ()
  14   LWP 14406 "txn-status-mana" 0x00007f9f359a8fb9 in ?? ()
  15   LWP 14408 "collect_and_rem" 0x00007f9f359a8fb9 in ?? ()
  16   LWP 14409 "tc-session-exp-" 0x00007f9f359a8fb9 in ?? ()
  17   LWP 14410 "rpc worker-1441" 0x00007f9f359a8ad3 in ?? ()
  18   LWP 14411 "rpc worker-1441" 0x00007f9f359a8ad3 in ?? ()
  19   LWP 14412 "rpc worker-1441" 0x00007f9f359a8ad3 in ?? ()
  20   LWP 14413 "rpc worker-1441" 0x00007f9f359a8ad3 in ?? ()
  21   LWP 14414 "rpc worker-1441" 0x00007f9f359a8ad3 in ?? ()
  22   LWP 14415 "rpc worker-1441" 0x00007f9f359a8ad3 in ?? ()
  23   LWP 14416 "rpc worker-1441" 0x00007f9f359a8ad3 in ?? ()
  24   LWP 14417 "rpc worker-1441" 0x00007f9f359a8ad3 in ?? ()
  25   LWP 14418 "rpc worker-1441" 0x00007f9f359a8ad3 in ?? ()
  26   LWP 14419 "rpc worker-1441" 0x00007f9f359a8ad3 in ?? ()
  27   LWP 14420 "rpc worker-1442" 0x00007f9f359a8ad3 in ?? ()
  28   LWP 14421 "rpc worker-1442" 0x00007f9f359a8ad3 in ?? ()
  29   LWP 14422 "rpc worker-1442" 0x00007f9f359a8ad3 in ?? ()
  30   LWP 14423 "rpc worker-1442" 0x00007f9f359a8ad3 in ?? ()
  31   LWP 14424 "rpc worker-1442" 0x00007f9f359a8ad3 in ?? ()
  32   LWP 14425 "rpc worker-1442" 0x00007f9f359a8ad3 in ?? ()
  33   LWP 14426 "rpc worker-1442" 0x00007f9f359a8ad3 in ?? ()
  34   LWP 14427 "rpc worker-1442" 0x00007f9f359a8ad3 in ?? ()
  35   LWP 14428 "rpc worker-1442" 0x00007f9f359a8ad3 in ?? ()
  36   LWP 14429 "rpc worker-1442" 0x00007f9f359a8ad3 in ?? ()
  37   LWP 14430 "rpc worker-1443" 0x00007f9f359a8ad3 in ?? ()
  38   LWP 14431 "rpc worker-1443" 0x00007f9f359a8ad3 in ?? ()
  39   LWP 14432 "rpc worker-1443" 0x00007f9f359a8ad3 in ?? ()
  40   LWP 14433 "rpc worker-1443" 0x00007f9f359a8ad3 in ?? ()
  41   LWP 14434 "rpc worker-1443" 0x00007f9f359a8ad3 in ?? ()
  42   LWP 14435 "rpc worker-1443" 0x00007f9f359a8ad3 in ?? ()
  43   LWP 14436 "rpc worker-1443" 0x00007f9f359a8ad3 in ?? ()
  44   LWP 14437 "rpc worker-1443" 0x00007f9f359a8ad3 in ?? ()
  45   LWP 14438 "rpc worker-1443" 0x00007f9f359a8ad3 in ?? ()
  46   LWP 14439 "rpc worker-1443" 0x00007f9f359a8ad3 in ?? ()
  47   LWP 14440 "rpc worker-1444" 0x00007f9f359a8ad3 in ?? ()
  48   LWP 14441 "rpc worker-1444" 0x00007f9f359a8ad3 in ?? ()
  49   LWP 14442 "rpc worker-1444" 0x00007f9f359a8ad3 in ?? ()
  50   LWP 14443 "rpc worker-1444" 0x00007f9f359a8ad3 in ?? ()
  51   LWP 14444 "rpc worker-1444" 0x00007f9f359a8ad3 in ?? ()
  52   LWP 14445 "rpc worker-1444" 0x00007f9f359a8ad3 in ?? ()
  53   LWP 14446 "rpc worker-1444" 0x00007f9f359a8ad3 in ?? ()
  54   LWP 14447 "rpc worker-1444" 0x00007f9f359a8ad3 in ?? ()
  55   LWP 14448 "rpc worker-1444" 0x00007f9f359a8ad3 in ?? ()
  56   LWP 14449 "rpc worker-1444" 0x00007f9f359a8ad3 in ?? ()
  57   LWP 14450 "rpc worker-1445" 0x00007f9f359a8ad3 in ?? ()
  58   LWP 14451 "rpc worker-1445" 0x00007f9f359a8ad3 in ?? ()
  59   LWP 14452 "rpc worker-1445" 0x00007f9f359a8ad3 in ?? ()
  60   LWP 14453 "rpc worker-1445" 0x00007f9f359a8ad3 in ?? ()
  61   LWP 14454 "rpc worker-1445" 0x00007f9f359a8ad3 in ?? ()
  62   LWP 14455 "rpc worker-1445" 0x00007f9f359a8ad3 in ?? ()
  63   LWP 14456 "rpc worker-1445" 0x00007f9f359a8ad3 in ?? ()
  64   LWP 14457 "rpc worker-1445" 0x00007f9f359a8ad3 in ?? ()
  65   LWP 14458 "rpc worker-1445" 0x00007f9f359a8ad3 in ?? ()
  66   LWP 14459 "rpc worker-1445" 0x00007f9f359a8ad3 in ?? ()
  67   LWP 14460 "rpc worker-1446" 0x00007f9f359a8ad3 in ?? ()
  68   LWP 14461 "rpc worker-1446" 0x00007f9f359a8ad3 in ?? ()
  69   LWP 14462 "rpc worker-1446" 0x00007f9f359a8ad3 in ?? ()
  70   LWP 14463 "rpc worker-1446" 0x00007f9f359a8ad3 in ?? ()
  71   LWP 14464 "rpc worker-1446" 0x00007f9f359a8ad3 in ?? ()
  72   LWP 14465 "rpc worker-1446" 0x00007f9f359a8ad3 in ?? ()
  73   LWP 14466 "rpc worker-1446" 0x00007f9f359a8ad3 in ?? ()
  74   LWP 14467 "rpc worker-1446" 0x00007f9f359a8ad3 in ?? ()
  75   LWP 14468 "rpc worker-1446" 0x00007f9f359a8ad3 in ?? ()
  76   LWP 14469 "rpc worker-1446" 0x00007f9f359a8ad3 in ?? ()
  77   LWP 14470 "rpc worker-1447" 0x00007f9f359a8ad3 in ?? ()
  78   LWP 14471 "rpc worker-1447" 0x00007f9f359a8ad3 in ?? ()
  79   LWP 14472 "rpc worker-1447" 0x00007f9f359a8ad3 in ?? ()
  80   LWP 14473 "rpc worker-1447" 0x00007f9f359a8ad3 in ?? ()
  81   LWP 14474 "rpc worker-1447" 0x00007f9f359a8ad3 in ?? ()
  82   LWP 14475 "rpc worker-1447" 0x00007f9f359a8ad3 in ?? ()
  83   LWP 14476 "rpc worker-1447" 0x00007f9f359a8ad3 in ?? ()
  84   LWP 14477 "rpc worker-1447" 0x00007f9f359a8ad3 in ?? ()
  85   LWP 14478 "rpc worker-1447" 0x00007f9f359a8ad3 in ?? ()
  86   LWP 14479 "rpc worker-1447" 0x00007f9f359a8ad3 in ?? ()
  87   LWP 14480 "rpc worker-1448" 0x00007f9f359a8ad3 in ?? ()
  88   LWP 14481 "rpc worker-1448" 0x00007f9f359a8ad3 in ?? ()
  89   LWP 14482 "rpc worker-1448" 0x00007f9f359a8ad3 in ?? ()
  90   LWP 14483 "rpc worker-1448" 0x00007f9f359a8ad3 in ?? ()
  91   LWP 14484 "rpc worker-1448" 0x00007f9f359a8ad3 in ?? ()
  92   LWP 14485 "rpc worker-1448" 0x00007f9f359a8ad3 in ?? ()
  93   LWP 14486 "rpc worker-1448" 0x00007f9f359a8ad3 in ?? ()
  94   LWP 14487 "rpc worker-1448" 0x00007f9f359a8ad3 in ?? ()
  95   LWP 14488 "rpc worker-1448" 0x00007f9f359a8ad3 in ?? ()
  96   LWP 14489 "rpc worker-1448" 0x00007f9f359a8ad3 in ?? ()
  97   LWP 14490 "rpc worker-1449" 0x00007f9f359a8ad3 in ?? ()
  98   LWP 14491 "rpc worker-1449" 0x00007f9f359a8ad3 in ?? ()
  99   LWP 14492 "rpc worker-1449" 0x00007f9f359a8ad3 in ?? ()
  100  LWP 14493 "rpc worker-1449" 0x00007f9f359a8ad3 in ?? ()
  101  LWP 14494 "rpc worker-1449" 0x00007f9f359a8ad3 in ?? ()
  102  LWP 14495 "rpc worker-1449" 0x00007f9f359a8ad3 in ?? ()
  103  LWP 14496 "rpc worker-1449" 0x00007f9f359a8ad3 in ?? ()
  104  LWP 14497 "rpc worker-1449" 0x00007f9f359a8ad3 in ?? ()
  105  LWP 14498 "rpc worker-1449" 0x00007f9f359a8ad3 in ?? ()
  106  LWP 14499 "rpc worker-1449" 0x00007f9f359a8ad3 in ?? ()
  107  LWP 14500 "rpc worker-1450" 0x00007f9f359a8ad3 in ?? ()
  108  LWP 14501 "rpc worker-1450" 0x00007f9f359a8ad3 in ?? ()
  109  LWP 14502 "rpc worker-1450" 0x00007f9f359a8ad3 in ?? ()
  110  LWP 14503 "rpc worker-1450" 0x00007f9f359a8ad3 in ?? ()
  111  LWP 14504 "rpc worker-1450" 0x00007f9f359a8ad3 in ?? ()
  112  LWP 14505 "rpc worker-1450" 0x00007f9f359a8ad3 in ?? ()
  113  LWP 14506 "rpc worker-1450" 0x00007f9f359a8ad3 in ?? ()
  114  LWP 14507 "rpc worker-1450" 0x00007f9f359a8ad3 in ?? ()
  115  LWP 14508 "rpc worker-1450" 0x00007f9f359a8ad3 in ?? ()
  116  LWP 14509 "rpc worker-1450" 0x00007f9f359a8ad3 in ?? ()
  117  LWP 14510 "diag-logger-145" 0x00007f9f359a8fb9 in ?? ()
  118  LWP 14511 "result-tracker-" 0x00007f9f359a8fb9 in ?? ()
  119  LWP 14512 "excess-log-dele" 0x00007f9f359a8fb9 in ?? ()
  120  LWP 14513 "tcmalloc-memory" 0x00007f9f359a8fb9 in ?? ()
  121  LWP 14514 "acceptor-14514" 0x00007f9f33acc0c7 in ?? ()
  122  LWP 14515 "heartbeat-14515" 0x00007f9f359a8fb9 in ?? ()
  123  LWP 14516 "maintenance_sch" 0x00007f9f359a8fb9 in ?? ()
Thread 123 (LWP 14516):
#0  0x00007f9f359a8fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000021 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005634214bde50 in ?? ()
#5  0x00007f9eec44f470 in ?? ()
#6  0x0000000000000042 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 122 (LWP 14515):
#0  0x00007f9f359a8fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x0000563421427930 in ?? ()
#5  0x00007f9eecc503f0 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 121 (LWP 14514):
#0  0x00007f9f33acc0c7 in ?? ()
#1  0x00007f9eed451020 in ?? ()
#2  0x00007f9f3562cc02 in ?? ()
#3  0x00007f9eed451020 in ?? ()
#4  0x0000000000080800 in ?? ()
#5  0x00007f9eed4513e0 in ?? ()
#6  0x00007f9eed451090 in ?? ()
#7  0x00005634213e30f8 in ?? ()
#8  0x00007f9f35632699 in ?? ()
#9  0x00007f9eed451510 in ?? ()
#10 0x00007f9eed451700 in ?? ()
#11 0x0000008000000000 in ?? ()
#12 0x00007f9f359ac3a7 in ?? ()
#13 0x00007f9eed452520 in ?? ()
#14 0x00007f9eed451260 in ?? ()
#15 0x00005634214ac000 in ?? ()
#16 0x0000000000000000 in ?? ()
Thread 120 (LWP 14513):
#0  0x00007f9f359a8fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007ffc14a01a60 in ?? ()
#5  0x00007f9eedc52670 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 119 (LWP 14512):
#0  0x00007f9f359a8fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()
Thread 118 (LWP 14511):
#0  0x00007f9f359a8fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000008 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000056342135bb70 in ?? ()
#5  0x00007f9eeec54680 in ?? ()
#6  0x0000000000000010 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 117 (LWP 14510):
#0  0x00007f9f359a8fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000008 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005634216adc90 in ?? ()
#5  0x00007f9eef455550 in ?? ()
#6  0x0000000000000010 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 116 (LWP 14509):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 115 (LWP 14508):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 114 (LWP 14507):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 113 (LWP 14506):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 112 (LWP 14505):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 111 (LWP 14504):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 110 (LWP 14503):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 109 (LWP 14502):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 108 (LWP 14501):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 107 (LWP 14500):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 106 (LWP 14499):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 105 (LWP 14498):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 104 (LWP 14497):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 103 (LWP 14496):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 102 (LWP 14495):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 101 (LWP 14494):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 100 (LWP 14493):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000007 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00005634217360bc in ?? ()
#4  0x00007f9ef7c665d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f9ef7c665f0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x00005634217360a8 in ?? ()
#9  0x00007f9f359a8770 in ?? ()
#10 0x00007f9ef7c665f0 in ?? ()
#11 0x00007f9ef7c66650 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 99 (LWP 14492):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 98 (LWP 14491):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000056342173603c in ?? ()
#4  0x00007f9ef8c685d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f9ef8c685f0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x0000563421736028 in ?? ()
#9  0x00007f9f359a8770 in ?? ()
#10 0x00007f9ef8c685f0 in ?? ()
#11 0x00007f9ef8c68650 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 97 (LWP 14490):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 96 (LWP 14489):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 95 (LWP 14488):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 94 (LWP 14487):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 93 (LWP 14486):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 92 (LWP 14485):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 91 (LWP 14484):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 90 (LWP 14483):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 89 (LWP 14482):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 88 (LWP 14481):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 87 (LWP 14480):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 86 (LWP 14479):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 85 (LWP 14478):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 84 (LWP 14477):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 83 (LWP 14476):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 82 (LWP 14475):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 81 (LWP 14474):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 80 (LWP 14473):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 79 (LWP 14472):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 78 (LWP 14471):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 77 (LWP 14470):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 76 (LWP 14469):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000268 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x00005634216a6f38 in ?? ()
#4  0x00007f9f03c7e5d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f9f03c7e5f0 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 75 (LWP 14468):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x00000000000002f8 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x00005634216a6eb8 in ?? ()
#4  0x00007f9f0447f5d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f9f0447f5f0 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 74 (LWP 14467):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 73 (LWP 14466):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 72 (LWP 14465):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 71 (LWP 14464):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 70 (LWP 14463):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 69 (LWP 14462):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 68 (LWP 14461):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 67 (LWP 14460):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 66 (LWP 14459):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 65 (LWP 14458):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 64 (LWP 14457):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 63 (LWP 14456):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 62 (LWP 14455):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 61 (LWP 14454):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 60 (LWP 14453):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 59 (LWP 14452):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 58 (LWP 14451):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 57 (LWP 14450):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 56 (LWP 14449):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 55 (LWP 14448):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 54 (LWP 14447):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 53 (LWP 14446):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 52 (LWP 14445):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 51 (LWP 14444):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 50 (LWP 14443):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 49 (LWP 14442):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 48 (LWP 14441):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 47 (LWP 14440):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 46 (LWP 14439):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 45 (LWP 14438):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 44 (LWP 14437):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 43 (LWP 14436):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 42 (LWP 14435):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 41 (LWP 14434):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 40 (LWP 14433):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 39 (LWP 14432):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 38 (LWP 14431):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 37 (LWP 14430):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x00005634216dbab8 in ?? ()
#4  0x00007f9f174a55d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f9f174a55f0 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 36 (LWP 14429):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 35 (LWP 14428):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 34 (LWP 14427):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 33 (LWP 14426):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 32 (LWP 14425):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 31 (LWP 14424):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 30 (LWP 14423):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 29 (LWP 14422):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x000000000000000e in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x00005634216dba38 in ?? ()
#4  0x00007f9f1b4ad5d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f9f1b4ad5f0 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 28 (LWP 14421):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 27 (LWP 14420):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 26 (LWP 14419):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 25 (LWP 14418):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 24 (LWP 14417):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 23 (LWP 14416):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 22 (LWP 14415):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x000000000000004b in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00005634216db9bc in ?? ()
#4  0x00007f9f1ecb45d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f9f1ecb45f0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x00005634216db9a8 in ?? ()
#9  0x00007f9f359a8770 in ?? ()
#10 0x00007f9f1ecb45f0 in ?? ()
#11 0x00007f9f1ecb4650 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 21 (LWP 14414):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 20 (LWP 14413):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 19 (LWP 14412):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 18 (LWP 14411):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 17 (LWP 14410):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 16 (LWP 14409):
#0  0x00007f9f359a8fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()
Thread 15 (LWP 14408):
#0  0x00007f9f359a8fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005634213416c8 in ?? ()
#5  0x00007f9f224bb6a0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 14 (LWP 14406):
#0  0x00007f9f359a8fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()
Thread 13 (LWP 14405):
#0  0x00007f9f359a8ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 12 (LWP 14404):
#0  0x00007f9f33acaa47 in ?? ()
#1  0x00007f9f244bf680 in ?? ()
#2  0x00007f9f2edce571 in ?? ()
#3  0x00000000000002e2 in ?? ()
#4  0x0000563421438e58 in ?? ()
#5  0x00007f9f244bf6c0 in ?? ()
#6  0x00007f9f244bf840 in ?? ()
#7  0x00005634214dcd30 in ?? ()
#8  0x00007f9f2edd025d in ?? ()
#9  0x3fb512d990438000 in ?? ()
#10 0x000056342142ac00 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000056342142ac00 in ?? ()
#13 0x0000000021438e58 in ?? ()
#14 0x0000563400000000 in ?? ()
#15 0x41da4020d459884e in ?? ()
#16 0x00005634214dcd30 in ?? ()
#17 0x00007f9f244bf720 in ?? ()
#18 0x00007f9f2edd4ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb512d990438000 in ?? ()
#21 0x0000000000000000 in ?? ()
Thread 11 (LWP 14403):
#0  0x00007f9f33acaa47 in ?? ()
#1  0x00007f9f24cc0680 in ?? ()
#2  0x00007f9f2edce571 in ?? ()
#3  0x00000000000002e2 in ?? ()
#4  0x0000563421439a98 in ?? ()
#5  0x00007f9f24cc06c0 in ?? ()
#6  0x00007f9f24cc0840 in ?? ()
#7  0x00005634214dcd30 in ?? ()
#8  0x00007f9f2edd025d in ?? ()
#9  0x3fb989598e110000 in ?? ()
#10 0x000056342142a100 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000056342142a100 in ?? ()
#13 0x0000000021439a98 in ?? ()
#14 0x0000563400000000 in ?? ()
#15 0x41da4020d459884c in ?? ()
#16 0x00005634214dcd30 in ?? ()
#17 0x00007f9f24cc0720 in ?? ()
#18 0x00007f9f2edd4ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb989598e110000 in ?? ()
#21 0x0000000000000000 in ?? ()
Thread 10 (LWP 14402):
#0  0x00007f9f33acaa47 in ?? ()
#1  0x00007f9f254c1680 in ?? ()
#2  0x00007f9f2edce571 in ?? ()
#3  0x00000000000002e2 in ?? ()
#4  0x0000563421439c58 in ?? ()
#5  0x00007f9f254c16c0 in ?? ()
#6  0x00007f9f254c1840 in ?? ()
#7  0x00005634214dcd30 in ?? ()
#8  0x00007f9f2edd025d in ?? ()
#9  0x3fb980e96341c000 in ?? ()
#10 0x0000563421429600 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x0000563421429600 in ?? ()
#13 0x0000000021439c58 in ?? ()
#14 0x0000563400000000 in ?? ()
#15 0x41da4020d459884b in ?? ()
#16 0x00005634214dcd30 in ?? ()
#17 0x00007f9f254c1720 in ?? ()
#18 0x00007f9f2edd4ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb980e96341c000 in ?? ()
#21 0x0000000000000000 in ?? ()
Thread 9 (LWP 14401):
#0  0x00007f9f33acaa47 in ?? ()
#1  0x00007f9f270ac680 in ?? ()
#2  0x00007f9f2edce571 in ?? ()
#3  0x00000000000002e2 in ?? ()
#4  0x0000563421439e18 in ?? ()
#5  0x00007f9f270ac6c0 in ?? ()
#6  0x00007f9f270ac840 in ?? ()
#7  0x00005634214dcd30 in ?? ()
#8  0x00007f9f2edd025d in ?? ()
#9  0x3fb98220e835c000 in ?? ()
#10 0x000056342142a680 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000056342142a680 in ?? ()
#13 0x0000000021439e18 in ?? ()
#14 0x0000563400000000 in ?? ()
#15 0x41da4020d459884c in ?? ()
#16 0x00005634214dcd30 in ?? ()
#17 0x00007f9f270ac720 in ?? ()
#18 0x00007f9f2edd4ba3 in ?? ()
#19 0x0000000000000000 in ?? ()
Thread 8 (LWP 14398):
#0  0x00007f9f33abdcb9 in ?? ()
#1  0x00007f9f288af840 in ?? ()
#2  0x0000000000000000 in ?? ()
Thread 7 (LWP 14397):
#0  0x00007f9f359a8fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()
Thread 6 (LWP 14396):
#0  0x00007f9f359ac9e2 in ?? ()
#1  0x000056342135bee0 in ?? ()
#2  0x00007f9f278ad4d0 in ?? ()
#3  0x00007f9f278ad450 in ?? ()
#4  0x00007f9f278ad570 in ?? ()
#5  0x00007f9f278ad790 in ?? ()
#6  0x00007f9f278ad7a0 in ?? ()
#7  0x00007f9f278ad4e0 in ?? ()
#8  0x00007f9f278ad4d0 in ?? ()
#9  0x000056342135a350 in ?? ()
#10 0x00007f9f35d97c6f in ?? ()
#11 0x3ff0000000000000 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 5 (LWP 14390):
#0  0x00007f9f359a8fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x000000000000002a in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005634214e0dc8 in ?? ()
#5  0x00007f9f298b1430 in ?? ()
#6  0x0000000000000054 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 4 (LWP 14389):
#0  0x00007f9f359a8fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x0000563421340848 in ?? ()
#5  0x00007f9f2a0b2790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 3 (LWP 14388):
#0  0x00007f9f359a8fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005634213402a8 in ?? ()
#5  0x00007f9f2a8b3790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 2 (LWP 14387):
#0  0x00007f9f359a8fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x0000563421340188 in ?? ()
#5  0x00007f9f2b0b4790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 1 (LWP 14383):
#0  0x00007f9f359acd50 in ?? ()
#1  0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20251028 09:11:04.455243  8282 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 1 with UUID a10940ed9d79478faf0dac2c7b960184 and pid 14250
************************ BEGIN STACKS **************************
[New LWP 14251]
[New LWP 14252]
[New LWP 14253]
[New LWP 14254]
[New LWP 14260]
[New LWP 14261]
[New LWP 14262]
[New LWP 14265]
[New LWP 14266]
[New LWP 14267]
[New LWP 14268]
[New LWP 14269]
[New LWP 14270]
[New LWP 14272]
[New LWP 14273]
[New LWP 14274]
[New LWP 14275]
[New LWP 14276]
[New LWP 14277]
[New LWP 14278]
[New LWP 14279]
[New LWP 14280]
[New LWP 14281]
[New LWP 14282]
[New LWP 14283]
[New LWP 14284]
[New LWP 14285]
[New LWP 14286]
[New LWP 14287]
[New LWP 14288]
[New LWP 14289]
[New LWP 14290]
[New LWP 14291]
[New LWP 14292]
[New LWP 14293]
[New LWP 14294]
[New LWP 14295]
[New LWP 14296]
[New LWP 14297]
[New LWP 14298]
[New LWP 14299]
[New LWP 14300]
[New LWP 14301]
[New LWP 14302]
[New LWP 14303]
[New LWP 14304]
[New LWP 14305]
[New LWP 14306]
[New LWP 14307]
[New LWP 14308]
[New LWP 14309]
[New LWP 14310]
[New LWP 14311]
[New LWP 14312]
[New LWP 14313]
[New LWP 14314]
[New LWP 14315]
[New LWP 14316]
[New LWP 14317]
[New LWP 14318]
[New LWP 14319]
[New LWP 14320]
[New LWP 14321]
[New LWP 14322]
[New LWP 14323]
[New LWP 14324]
[New LWP 14325]
[New LWP 14326]
[New LWP 14327]
[New LWP 14328]
[New LWP 14329]
[New LWP 14330]
[New LWP 14331]
[New LWP 14332]
[New LWP 14333]
[New LWP 14334]
[New LWP 14335]
[New LWP 14336]
[New LWP 14337]
[New LWP 14338]
[New LWP 14339]
[New LWP 14340]
[New LWP 14341]
[New LWP 14342]
[New LWP 14343]
[New LWP 14344]
[New LWP 14345]
[New LWP 14346]
[New LWP 14347]
[New LWP 14348]
[New LWP 14349]
[New LWP 14350]
[New LWP 14351]
[New LWP 14352]
[New LWP 14353]
[New LWP 14354]
[New LWP 14355]
[New LWP 14356]
[New LWP 14357]
[New LWP 14358]
[New LWP 14359]
[New LWP 14360]
[New LWP 14361]
[New LWP 14362]
[New LWP 14363]
[New LWP 14364]
[New LWP 14365]
[New LWP 14366]
[New LWP 14367]
[New LWP 14368]
[New LWP 14369]
[New LWP 14370]
[New LWP 14371]
[New LWP 14372]
[New LWP 14373]
[New LWP 14374]
[New LWP 14375]
[New LWP 14376]
[New LWP 14377]
[New LWP 14378]
[New LWP 14379]
[New LWP 14380]
[New LWP 14861]
0x00007f17464b4d50 in ?? ()
  Id   Target Id         Frame 
* 1    LWP 14250 "kudu"  0x00007f17464b4d50 in ?? ()
  2    LWP 14251 "kudu"  0x00007f17464b0fb9 in ?? ()
  3    LWP 14252 "kudu"  0x00007f17464b0fb9 in ?? ()
  4    LWP 14253 "kudu"  0x00007f17464b0fb9 in ?? ()
  5    LWP 14254 "kernel-watcher-" 0x00007f17464b0fb9 in ?? ()
  6    LWP 14260 "ntp client-1426" 0x00007f17464b49e2 in ?? ()
  7    LWP 14261 "file cache-evic" 0x00007f17464b0fb9 in ?? ()
  8    LWP 14262 "sq_acceptor" 0x00007f17445c5cb9 in ?? ()
  9    LWP 14265 "rpc reactor-142" 0x00007f17445d2a47 in ?? ()
  10   LWP 14266 "rpc reactor-142" 0x00007f17445d2a47 in ?? ()
  11   LWP 14267 "rpc reactor-142" 0x00007f17445d2a47 in ?? ()
  12   LWP 14268 "rpc reactor-142" 0x00007f17445d2a47 in ?? ()
  13   LWP 14269 "MaintenanceMgr " 0x00007f17464b0ad3 in ?? ()
  14   LWP 14270 "txn-status-mana" 0x00007f17464b0fb9 in ?? ()
  15   LWP 14272 "collect_and_rem" 0x00007f17464b0fb9 in ?? ()
  16   LWP 14273 "tc-session-exp-" 0x00007f17464b0fb9 in ?? ()
  17   LWP 14274 "rpc worker-1427" 0x00007f17464b0ad3 in ?? ()
  18   LWP 14275 "rpc worker-1427" 0x00007f17464b0ad3 in ?? ()
  19   LWP 14276 "rpc worker-1427" 0x00007f17464b0ad3 in ?? ()
  20   LWP 14277 "rpc worker-1427" 0x00007f17464b0ad3 in ?? ()
  21   LWP 14278 "rpc worker-1427" 0x00007f17464b0ad3 in ?? ()
  22   LWP 14279 "rpc worker-1427" 0x00007f17464b0ad3 in ?? ()
  23   LWP 14280 "rpc worker-1428" 0x00007f17464b0ad3 in ?? ()
  24   LWP 14281 "rpc worker-1428" 0x00007f17464b0ad3 in ?? ()
  25   LWP 14282 "rpc worker-1428" 0x00007f17464b0ad3 in ?? ()
  26   LWP 14283 "rpc worker-1428" 0x00007f17464b0ad3 in ?? ()
  27   LWP 14284 "rpc worker-1428" 0x00007f17464b0ad3 in ?? ()
  28   LWP 14285 "rpc worker-1428" 0x00007f17464b0ad3 in ?? ()
  29   LWP 14286 "rpc worker-1428" 0x00007f17464b0ad3 in ?? ()
  30   LWP 14287 "rpc worker-1428" 0x00007f17464b0ad3 in ?? ()
  31   LWP 14288 "rpc worker-1428" 0x00007f17464b0ad3 in ?? ()
  32   LWP 14289 "rpc worker-1428" 0x00007f17464b0ad3 in ?? ()
  33   LWP 14290 "rpc worker-1429" 0x00007f17464b0ad3 in ?? ()
  34   LWP 14291 "rpc worker-1429" 0x00007f17464b0ad3 in ?? ()
  35   LWP 14292 "rpc worker-1429" 0x00007f17464b0ad3 in ?? ()
  36   LWP 14293 "rpc worker-1429" 0x00007f17464b0ad3 in ?? ()
  37   LWP 14294 "rpc worker-1429" 0x00007f17464b0ad3 in ?? ()
  38   LWP 14295 "rpc worker-1429" 0x00007f17464b0ad3 in ?? ()
  39   LWP 14296 "rpc worker-1429" 0x00007f17464b0ad3 in ?? ()
  40   LWP 14297 "rpc worker-1429" 0x00007f17464b0ad3 in ?? ()
  41   LWP 14298 "rpc worker-1429" 0x00007f17464b0ad3 in ?? ()
  42   LWP 14299 "rpc worker-1429" 0x00007f17464b0ad3 in ?? ()
  43   LWP 14300 "rpc worker-1430" 0x00007f17464b0ad3 in ?? ()
  44   LWP 14301 "rpc worker-1430" 0x00007f17464b0ad3 in ?? ()
  45   LWP 14302 "rpc worker-1430" 0x00007f17464b0ad3 in ?? ()
  46   LWP 14303 "rpc worker-1430" 0x00007f17464b0ad3 in ?? ()
  47   LWP 14304 "rpc worker-1430" 0x00007f17464b0ad3 in ?? ()
  48   LWP 14305 "rpc worker-1430" 0x00007f17464b0ad3 in ?? ()
  49   LWP 14306 "rpc worker-1430" 0x00007f17464b0ad3 in ?? ()
  50   LWP 14307 "rpc worker-1430" 0x00007f17464b0ad3 in ?? ()
  51   LWP 14308 "rpc worker-1430" 0x00007f17464b0ad3 in ?? ()
  52   LWP 14309 "rpc worker-1430" 0x00007f17464b0ad3 in ?? ()
  53   LWP 14310 "rpc worker-1431" 0x00007f17464b0ad3 in ?? ()
  54   LWP 14311 "rpc worker-1431" 0x00007f17464b0ad3 in ?? ()
  55   LWP 14312 "rpc worker-1431" 0x00007f17464b0ad3 in ?? ()
  56   LWP 14313 "rpc worker-1431" 0x00007f17464b0ad3 in ?? ()
  57   LWP 14314 "rpc worker-1431" 0x00007f17464b0ad3 in ?? ()
  58   LWP 14315 "rpc worker-1431" 0x00007f17464b0ad3 in ?? ()
  59   LWP 14316 "rpc worker-1431" 0x00007f17464b0ad3 in ?? ()
  60   LWP 14317 "rpc worker-1431" 0x00007f17464b0ad3 in ?? ()
  61   LWP 14318 "rpc worker-1431" 0x00007f17464b0ad3 in ?? ()
  62   LWP 14319 "rpc worker-1431" 0x00007f17464b0ad3 in ?? ()
  63   LWP 14320 "rpc worker-1432" 0x00007f17464b0ad3 in ?? ()
  64   LWP 14321 "rpc worker-1432" 0x00007f17464b0ad3 in ?? ()
  65   LWP 14322 "rpc worker-1432" 0x00007f17464b0ad3 in ?? ()
  66   LWP 14323 "rpc worker-1432" 0x00007f17464b0ad3 in ?? ()
  67   LWP 14324 "rpc worker-1432" 0x00007f17464b0ad3 in ?? ()
  68   LWP 14325 "rpc worker-1432" 0x00007f17464b0ad3 in ?? ()
  69   LWP 14326 "rpc worker-1432" 0x00007f17464b0ad3 in ?? ()
  70   LWP 14327 "rpc worker-1432" 0x00007f17464b0ad3 in ?? ()
  71   LWP 14328 "rpc worker-1432" 0x00007f17464b0ad3 in ?? ()
  72   LWP 14329 "rpc worker-1432" 0x00007f17464b0ad3 in ?? ()
  73   LWP 14330 "rpc worker-1433" 0x00007f17464b0ad3 in ?? ()
  74   LWP 14331 "rpc worker-1433" 0x00007f17464b0ad3 in ?? ()
  75   LWP 14332 "rpc worker-1433" 0x00007f17464b0ad3 in ?? ()
  76   LWP 14333 "rpc worker-1433" 0x00007f17464b0ad3 in ?? ()
  77   LWP 14334 "rpc worker-1433" 0x00007f17464b0ad3 in ?? ()
  78   LWP 14335 "rpc worker-1433" 0x00007f17464b0ad3 in ?? ()
  79   LWP 14336 "rpc worker-1433" 0x00007f17464b0ad3 in ?? ()
  80   LWP 14337 "rpc worker-1433" 0x00007f17464b0ad3 in ?? ()
  81   LWP 14338 "rpc worker-1433" 0x00007f17464b0ad3 in ?? ()
  82   LWP 14339 "rpc worker-1433" 0x00007f17464b0ad3 in ?? ()
  83   LWP 14340 "rpc worker-1434" 0x00007f17464b0ad3 in ?? ()
  84   LWP 14341 "rpc worker-1434" 0x00007f17464b0ad3 in ?? ()
  85   LWP 14342 "rpc worker-1434" 0x00007f17464b0ad3 in ?? ()
  86   LWP 14343 "rpc worker-1434" 0x00007f17464b0ad3 in ?? ()
  87   LWP 14344 "rpc worker-1434" 0x00007f17464b0ad3 in ?? ()
  88   LWP 14345 "rpc worker-1434" 0x00007f17464b0ad3 in ?? ()
  89   LWP 14346 "rpc worker-1434" 0x00007f17464b0ad3 in ?? ()
  90   LWP 14347 "rpc worker-1434" 0x00007f17464b0ad3 in ?? ()
  91   LWP 14348 "rpc worker-1434" 0x00007f17464b0ad3 in ?? ()
  92   LWP 14349 "rpc worker-1434" 0x00007f17464b0ad3 in ?? ()
  93   LWP 14350 "rpc worker-1435" 0x00007f17464b0ad3 in ?? ()
  94   LWP 14351 "rpc worker-1435" 0x00007f17464b0ad3 in ?? ()
  95   LWP 14352 "rpc worker-1435" 0x00007f17464b0ad3 in ?? ()
  96   LWP 14353 "rpc worker-1435" 0x00007f17464b0ad3 in ?? ()
  97   LWP 14354 "rpc worker-1435" 0x00007f17464b0ad3 in ?? ()
  98   LWP 14355 "rpc worker-1435" 0x00007f17464b0ad3 in ?? ()
  99   LWP 14356 "rpc worker-1435" 0x00007f17464b0ad3 in ?? ()
  100  LWP 14357 "rpc worker-1435" 0x00007f17464b0ad3 in ?? ()
  101  LWP 14358 "rpc worker-1435" 0x00007f17464b0ad3 in ?? ()
  102  LWP 14359 "rpc worker-1435" 0x00007f17464b0ad3 in ?? ()
  103  LWP 14360 "rpc worker-1436" 0x00007f17464b0ad3 in ?? ()
  104  LWP 14361 "rpc worker-1436" 0x00007f17464b0ad3 in ?? ()
  105  LWP 14362 "rpc worker-1436" 0x00007f17464b0ad3 in ?? ()
  106  LWP 14363 "rpc worker-1436" 0x00007f17464b0ad3 in ?? ()
  107  LWP 14364 "rpc worker-1436" 0x00007f17464b0ad3 in ?? ()
  108  LWP 14365 "rpc worker-1436" 0x00007f17464b0ad3 in ?? ()
  109  LWP 14366 "rpc worker-1436" 0x00007f17464b0ad3 in ?? ()
  110  LWP 14367 "rpc worker-1436" 0x00007f17464b0ad3 in ?? ()
  111  LWP 14368 "rpc worker-1436" 0x00007f17464b0ad3 in ?? ()
  112  LWP 14369 "rpc worker-1436" 0x00007f17464b0ad3 in ?? ()
  113  LWP 14370 "rpc worker-1437" 0x00007f17464b0ad3 in ?? ()
  114  LWP 14371 "rpc worker-1437" 0x00007f17464b0ad3 in ?? ()
  115  LWP 14372 "rpc worker-1437" 0x00007f17464b0ad3 in ?? ()
  116  LWP 14373 "rpc worker-1437" 0x00007f17464b0ad3 in ?? ()
  117  LWP 14374 "diag-logger-143" 0x00007f17464b0fb9 in ?? ()
  118  LWP 14375 "result-tracker-" 0x00007f17464b0fb9 in ?? ()
  119  LWP 14376 "excess-log-dele" 0x00007f17464b0fb9 in ?? ()
  120  LWP 14377 "tcmalloc-memory" 0x00007f17464b0fb9 in ?? ()
  121  LWP 14378 "acceptor-14378" 0x00007f17445d40c7 in ?? ()
  122  LWP 14379 "heartbeat-14379" 0x00007f17464b0fb9 in ?? ()
  123  LWP 14380 "maintenance_sch" 0x00007f17464b0fb9 in ?? ()
  124  LWP 14861 "raft [worker]-1" 0x00007f17464b0fb9 in ?? ()
Thread 124 (LWP 14861):
#0  0x00007f17464b0fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000165 in ?? ()
#3  0x0000000100000081 in ?? ()
#4  0x00007f16f7f4d764 in ?? ()
#5  0x00007f16f7f4d510 in ?? ()
#6  0x00000000000002cb in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x00007f16f7f4d530 in ?? ()
#9  0x0000000200000000 in ?? ()
#10 0x0000008000000189 in ?? ()
#11 0x00007f16f7f4d590 in ?? ()
#12 0x00007f17461242e1 in ?? ()
#13 0x0000000000000000 in ?? ()
Thread 123 (LWP 14380):
#0  0x00007f17464b0fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000024 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055969b83fe50 in ?? ()
#5  0x00007f16fcf57470 in ?? ()
#6  0x0000000000000048 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 122 (LWP 14379):
#0  0x00007f17464b0fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x000000000000000c in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055969b7a9930 in ?? ()
#5  0x00007f16fd7583f0 in ?? ()
#6  0x0000000000000018 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 121 (LWP 14378):
#0  0x00007f17445d40c7 in ?? ()
#1  0x00007f16fdf59020 in ?? ()
#2  0x00007f1746134c02 in ?? ()
#3  0x00007f16fdf59020 in ?? ()
#4  0x0000000000080800 in ?? ()
#5  0x00007f16fdf593e0 in ?? ()
#6  0x00007f16fdf59090 in ?? ()
#7  0x000055969b7650f8 in ?? ()
#8  0x00007f174613a699 in ?? ()
#9  0x00007f16fdf59510 in ?? ()
#10 0x00007f16fdf59700 in ?? ()
#11 0x0000008000000000 in ?? ()
#12 0x00007f17464b43a7 in ?? ()
#13 0x00007f16fdf5a520 in ?? ()
#14 0x00007f16fdf59260 in ?? ()
#15 0x000055969b82e000 in ?? ()
#16 0x0000000000000000 in ?? ()
Thread 120 (LWP 14377):
#0  0x00007f17464b0fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007ffc670bcac0 in ?? ()
#5  0x00007f16fe75a670 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 119 (LWP 14376):
#0  0x00007f17464b0fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()
Thread 118 (LWP 14375):
#0  0x00007f17464b0fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055969b6ddb70 in ?? ()
#5  0x00007f16ff75c680 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 117 (LWP 14374):
#0  0x00007f17464b0fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055969ba39c90 in ?? ()
#5  0x00007f16fff5d550 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 116 (LWP 14373):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000007 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055969ba6c73c in ?? ()
#4  0x00007f170075e5d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f170075e5f0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055969ba6c728 in ?? ()
#9  0x00007f17464b0770 in ?? ()
#10 0x00007f170075e5f0 in ?? ()
#11 0x00007f170075e650 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 115 (LWP 14372):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055969ba6c6bc in ?? ()
#4  0x00007f1700f5f5d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f1700f5f5f0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055969ba6c6a8 in ?? ()
#9  0x00007f17464b0770 in ?? ()
#10 0x00007f1700f5f5f0 in ?? ()
#11 0x00007f1700f5f650 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 114 (LWP 14371):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 113 (LWP 14370):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 112 (LWP 14369):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 111 (LWP 14368):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 110 (LWP 14367):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 109 (LWP 14366):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 108 (LWP 14365):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 107 (LWP 14364):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 106 (LWP 14363):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 105 (LWP 14362):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 104 (LWP 14361):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 103 (LWP 14360):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 102 (LWP 14359):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 101 (LWP 14358):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 100 (LWP 14357):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 99 (LWP 14356):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 98 (LWP 14355):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 97 (LWP 14354):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 96 (LWP 14353):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 95 (LWP 14352):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 94 (LWP 14351):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 93 (LWP 14350):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 92 (LWP 14349):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 91 (LWP 14348):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 90 (LWP 14347):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 89 (LWP 14346):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 88 (LWP 14345):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 87 (LWP 14344):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 86 (LWP 14343):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 85 (LWP 14342):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 84 (LWP 14341):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 83 (LWP 14340):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 82 (LWP 14339):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 81 (LWP 14338):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 80 (LWP 14337):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 79 (LWP 14336):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 78 (LWP 14335):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 77 (LWP 14334):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 76 (LWP 14333):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000004 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055969ba31138 in ?? ()
#4  0x00007f17147865d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f17147865f0 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 75 (LWP 14332):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 74 (LWP 14331):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 73 (LWP 14330):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 72 (LWP 14329):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 71 (LWP 14328):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 70 (LWP 14327):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 69 (LWP 14326):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 68 (LWP 14325):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 67 (LWP 14324):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 66 (LWP 14323):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 65 (LWP 14322):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 64 (LWP 14321):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 63 (LWP 14320):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 62 (LWP 14319):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 61 (LWP 14318):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 60 (LWP 14317):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 59 (LWP 14316):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 58 (LWP 14315):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 57 (LWP 14314):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 56 (LWP 14313):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055969ba30638 in ?? ()
#4  0x00007f171e79a5d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f171e79a5f0 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 55 (LWP 14312):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 54 (LWP 14311):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 53 (LWP 14310):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 52 (LWP 14309):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 51 (LWP 14308):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 50 (LWP 14307):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 49 (LWP 14306):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 48 (LWP 14305):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 47 (LWP 14304):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 46 (LWP 14303):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 45 (LWP 14302):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 44 (LWP 14301):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 43 (LWP 14300):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 42 (LWP 14299):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 41 (LWP 14298):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 40 (LWP 14297):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 39 (LWP 14296):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 38 (LWP 14295):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 37 (LWP 14294):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 36 (LWP 14293):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000189 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055969b949b3c in ?? ()
#4  0x00007f17287ae5d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f17287ae5f0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055969b949b28 in ?? ()
#9  0x00007f17464b0770 in ?? ()
#10 0x00007f17287ae5f0 in ?? ()
#11 0x00007f17287ae650 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 35 (LWP 14292):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x00000000000002ed in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055969b949abc in ?? ()
#4  0x00007f1728faf5d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f1728faf5f0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055969b949aa8 in ?? ()
#9  0x00007f17464b0770 in ?? ()
#10 0x00007f1728faf5f0 in ?? ()
#11 0x00007f1728faf650 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 34 (LWP 14291):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x00000000000000e3 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055969b949a3c in ?? ()
#4  0x00007f17297b05d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f17297b05f0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055969b949a28 in ?? ()
#9  0x00007f17464b0770 in ?? ()
#10 0x00007f17297b05f0 in ?? ()
#11 0x00007f17297b0650 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 33 (LWP 14290):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 32 (LWP 14289):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 31 (LWP 14288):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 30 (LWP 14287):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 29 (LWP 14286):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 28 (LWP 14285):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 27 (LWP 14284):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 26 (LWP 14283):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 25 (LWP 14282):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 24 (LWP 14281):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 23 (LWP 14280):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 22 (LWP 14279):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 21 (LWP 14278):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 20 (LWP 14277):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 19 (LWP 14276):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 18 (LWP 14275):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 17 (LWP 14274):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 16 (LWP 14273):
#0  0x00007f17464b0fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()
Thread 15 (LWP 14272):
#0  0x00007f17464b0fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055969b6c36c8 in ?? ()
#5  0x00007f1732fc36a0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 14 (LWP 14270):
#0  0x00007f17464b0fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()
Thread 13 (LWP 14269):
#0  0x00007f17464b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 12 (LWP 14268):
#0  0x00007f17445d2a47 in ?? ()
#1  0x00007f1734fc7680 in ?? ()
#2  0x00007f173f8d6571 in ?? ()
#3  0x00000000000002e2 in ?? ()
#4  0x000055969b7bae58 in ?? ()
#5  0x00007f1734fc76c0 in ?? ()
#6  0x00007f1734fc7840 in ?? ()
#7  0x000055969b85ed30 in ?? ()
#8  0x00007f173f8d825d in ?? ()
#9  0x3fa8ef7416960000 in ?? ()
#10 0x000055969b7acc00 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055969b7acc00 in ?? ()
#13 0x000000009b7bae58 in ?? ()
#14 0x0000559600000000 in ?? ()
#15 0x41da4020d459884c in ?? ()
#16 0x000055969b85ed30 in ?? ()
#17 0x00007f1734fc7720 in ?? ()
#18 0x00007f173f8dcba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fa8ef7416960000 in ?? ()
#21 0x0000000000000000 in ?? ()
Thread 11 (LWP 14267):
#0  0x00007f17445d2a47 in ?? ()
#1  0x00007f17357c8680 in ?? ()
#2  0x00007f173f8d6571 in ?? ()
#3  0x00000000000002e2 in ?? ()
#4  0x000055969b7bba98 in ?? ()
#5  0x00007f17357c86c0 in ?? ()
#6  0x00007f17357c8840 in ?? ()
#7  0x000055969b85ed30 in ?? ()
#8  0x00007f173f8d825d in ?? ()
#9  0x3fa84cf4246d8000 in ?? ()
#10 0x000055969b7ac100 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055969b7ac100 in ?? ()
#13 0x000000009b7bba98 in ?? ()
#14 0x0000559600000000 in ?? ()
#15 0x41da4020d459884f in ?? ()
#16 0x000055969b85ed30 in ?? ()
#17 0x00007f17357c8720 in ?? ()
#18 0x00007f173f8dcba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fa84cf4246d8000 in ?? ()
#21 0x0000000000000000 in ?? ()
Thread 10 (LWP 14266):
#0  0x00007f17445d2a47 in ?? ()
#1  0x00007f1735fc9680 in ?? ()
#2  0x00007f173f8d6571 in ?? ()
#3  0x00000000000002e2 in ?? ()
#4  0x000055969b7bbc58 in ?? ()
#5  0x00007f1735fc96c0 in ?? ()
#6  0x00007f1735fc9840 in ?? ()
#7  0x000055969b85ed30 in ?? ()
#8  0x00007f173f8d825d in ?? ()
#9  0x3fa945572aaa8000 in ?? ()
#10 0x000055969b7ab600 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055969b7ab600 in ?? ()
#13 0x000000009b7bbc58 in ?? ()
#14 0x0000559600000000 in ?? ()
#15 0x41da4020d4598852 in ?? ()
#16 0x000055969b85ed30 in ?? ()
#17 0x00007f1735fc9720 in ?? ()
#18 0x00007f173f8dcba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fa945572aaa8000 in ?? ()
#21 0x0000000000000000 in ?? ()
Thread 9 (LWP 14265):
#0  0x00007f17445d2a47 in ?? ()
#1  0x00007f1737bb4680 in ?? ()
#2  0x00007f173f8d6571 in ?? ()
#3  0x00000000000002e2 in ?? ()
#4  0x000055969b7bbe18 in ?? ()
#5  0x00007f1737bb46c0 in ?? ()
#6  0x00007f1737bb4840 in ?? ()
#7  0x000055969b85ed30 in ?? ()
#8  0x00007f173f8d825d in ?? ()
#9  0x3fa7e98583ea0000 in ?? ()
#10 0x000055969b7ac680 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055969b7ac680 in ?? ()
#13 0x000000009b7bbe18 in ?? ()
#14 0x0000559600000000 in ?? ()
#15 0x41da4020d459884c in ?? ()
#16 0x000055969b85ed30 in ?? ()
#17 0x00007f1737bb4720 in ?? ()
#18 0x00007f173f8dcba3 in ?? ()
#19 0x0000000000000000 in ?? ()
Thread 8 (LWP 14262):
#0  0x00007f17445c5cb9 in ?? ()
#1  0x00007f17393b7840 in ?? ()
#2  0x0000000000000000 in ?? ()
Thread 7 (LWP 14261):
#0  0x00007f17464b0fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()
Thread 6 (LWP 14260):
#0  0x00007f17464b49e2 in ?? ()
#1  0x000055969b6ddee0 in ?? ()
#2  0x00007f17383b54d0 in ?? ()
#3  0x00007f17383b5450 in ?? ()
#4  0x00007f17383b5570 in ?? ()
#5  0x00007f17383b5790 in ?? ()
#6  0x00007f17383b57a0 in ?? ()
#7  0x00007f17383b54e0 in ?? ()
#8  0x00007f17383b54d0 in ?? ()
#9  0x000055969b6dc350 in ?? ()
#10 0x00007f174689fc6f in ?? ()
#11 0x3ff0000000000000 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 5 (LWP 14254):
#0  0x00007f17464b0fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x000000000000002d in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055969b862dc8 in ?? ()
#5  0x00007f173a3b9430 in ?? ()
#6  0x000000000000005a in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 4 (LWP 14253):
#0  0x00007f17464b0fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055969b6c2848 in ?? ()
#5  0x00007f173abba790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 3 (LWP 14252):
#0  0x00007f17464b0fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055969b6c22a8 in ?? ()
#5  0x00007f173b3bb790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 2 (LWP 14251):
#0  0x00007f17464b0fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055969b6c2188 in ?? ()
#5  0x00007f173bbbc790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 1 (LWP 14250):
#0  0x00007f17464b4d50 in ?? ()
#1  0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20251028 09:11:04.979365  8282 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 2 with UUID bdeedf35d61a49bda10c70688541a0ea and pid 14651
************************ BEGIN STACKS **************************
[New LWP 14655]
[New LWP 14656]
[New LWP 14657]
[New LWP 14658]
[New LWP 14664]
[New LWP 14665]
[New LWP 14666]
[New LWP 14669]
[New LWP 14670]
[New LWP 14671]
[New LWP 14672]
[New LWP 14673]
[New LWP 14674]
[New LWP 14675]
[New LWP 14676]
[New LWP 14677]
[New LWP 14678]
[New LWP 14679]
[New LWP 14680]
[New LWP 14681]
[New LWP 14682]
[New LWP 14683]
[New LWP 14684]
[New LWP 14685]
[New LWP 14686]
[New LWP 14687]
[New LWP 14688]
[New LWP 14689]
[New LWP 14690]
[New LWP 14691]
[New LWP 14692]
[New LWP 14693]
[New LWP 14694]
[New LWP 14695]
[New LWP 14696]
[New LWP 14697]
[New LWP 14698]
[New LWP 14699]
[New LWP 14700]
[New LWP 14701]
[New LWP 14702]
[New LWP 14703]
[New LWP 14704]
[New LWP 14705]
[New LWP 14706]
[New LWP 14707]
[New LWP 14708]
[New LWP 14709]
[New LWP 14710]
[New LWP 14711]
[New LWP 14712]
[New LWP 14713]
[New LWP 14714]
[New LWP 14715]
[New LWP 14716]
[New LWP 14717]
[New LWP 14718]
[New LWP 14719]
[New LWP 14720]
[New LWP 14721]
[New LWP 14722]
[New LWP 14723]
[New LWP 14724]
[New LWP 14725]
[New LWP 14726]
[New LWP 14727]
[New LWP 14728]
[New LWP 14729]
[New LWP 14730]
[New LWP 14731]
[New LWP 14732]
[New LWP 14733]
[New LWP 14734]
[New LWP 14735]
[New LWP 14736]
[New LWP 14737]
[New LWP 14738]
[New LWP 14739]
[New LWP 14740]
[New LWP 14741]
[New LWP 14742]
[New LWP 14743]
[New LWP 14744]
[New LWP 14745]
[New LWP 14746]
[New LWP 14747]
[New LWP 14748]
[New LWP 14749]
[New LWP 14750]
[New LWP 14751]
[New LWP 14752]
[New LWP 14753]
[New LWP 14754]
[New LWP 14755]
[New LWP 14756]
[New LWP 14757]
[New LWP 14758]
[New LWP 14759]
[New LWP 14760]
[New LWP 14761]
[New LWP 14762]
[New LWP 14763]
[New LWP 14764]
[New LWP 14765]
[New LWP 14766]
[New LWP 14767]
[New LWP 14768]
[New LWP 14769]
[New LWP 14770]
[New LWP 14771]
[New LWP 14772]
[New LWP 14773]
[New LWP 14774]
[New LWP 14775]
[New LWP 14776]
[New LWP 14777]
[New LWP 14778]
[New LWP 14779]
[New LWP 14780]
[New LWP 14781]
[New LWP 14782]
[New LWP 14783]
0x00007f042c439d50 in ?? ()
  Id   Target Id         Frame 
* 1    LWP 14651 "kudu"  0x00007f042c439d50 in ?? ()
  2    LWP 14655 "kudu"  0x00007f042c435fb9 in ?? ()
  3    LWP 14656 "kudu"  0x00007f042c435fb9 in ?? ()
  4    LWP 14657 "kudu"  0x00007f042c435fb9 in ?? ()
  5    LWP 14658 "kernel-watcher-" 0x00007f042c435fb9 in ?? ()
  6    LWP 14664 "ntp client-1466" 0x00007f042c4399e2 in ?? ()
  7    LWP 14665 "file cache-evic" 0x00007f042c435fb9 in ?? ()
  8    LWP 14666 "sq_acceptor" 0x00007f042a54acb9 in ?? ()
  9    LWP 14669 "rpc reactor-146" 0x00007f042a557a47 in ?? ()
  10   LWP 14670 "rpc reactor-146" 0x00007f042a557a47 in ?? ()
  11   LWP 14671 "rpc reactor-146" 0x00007f042a557a47 in ?? ()
  12   LWP 14672 "rpc reactor-146" 0x00007f042a557a47 in ?? ()
  13   LWP 14673 "MaintenanceMgr " 0x00007f042c435ad3 in ?? ()
  14   LWP 14674 "txn-status-mana" 0x00007f042c435fb9 in ?? ()
  15   LWP 14675 "collect_and_rem" 0x00007f042c435fb9 in ?? ()
  16   LWP 14676 "tc-session-exp-" 0x00007f042c435fb9 in ?? ()
  17   LWP 14677 "rpc worker-1467" 0x00007f042c435ad3 in ?? ()
  18   LWP 14678 "rpc worker-1467" 0x00007f042c435ad3 in ?? ()
  19   LWP 14679 "rpc worker-1467" 0x00007f042c435ad3 in ?? ()
  20   LWP 14680 "rpc worker-1468" 0x00007f042c435ad3 in ?? ()
  21   LWP 14681 "rpc worker-1468" 0x00007f042c435ad3 in ?? ()
  22   LWP 14682 "rpc worker-1468" 0x00007f042c435ad3 in ?? ()
  23   LWP 14683 "rpc worker-1468" 0x00007f042c435ad3 in ?? ()
  24   LWP 14684 "rpc worker-1468" 0x00007f042c435ad3 in ?? ()
  25   LWP 14685 "rpc worker-1468" 0x00007f042c435ad3 in ?? ()
  26   LWP 14686 "rpc worker-1468" 0x00007f042c435ad3 in ?? ()
  27   LWP 14687 "rpc worker-1468" 0x00007f042c435ad3 in ?? ()
  28   LWP 14688 "rpc worker-1468" 0x00007f042c435ad3 in ?? ()
  29   LWP 14689 "rpc worker-1468" 0x00007f042c435ad3 in ?? ()
  30   LWP 14690 "rpc worker-1469" 0x00007f042c435ad3 in ?? ()
  31   LWP 14691 "rpc worker-1469" 0x00007f042c435ad3 in ?? ()
  32   LWP 14692 "rpc worker-1469" 0x00007f042c435ad3 in ?? ()
  33   LWP 14693 "rpc worker-1469" 0x00007f042c435ad3 in ?? ()
  34   LWP 14694 "rpc worker-1469" 0x00007f042c435ad3 in ?? ()
  35   LWP 14695 "rpc worker-1469" 0x00007f042c435ad3 in ?? ()
  36   LWP 14696 "rpc worker-1469" 0x00007f042c435ad3 in ?? ()
  37   LWP 14697 "rpc worker-1469" 0x00007f042c435ad3 in ?? ()
  38   LWP 14698 "rpc worker-1469" 0x00007f042c435ad3 in ?? ()
  39   LWP 14699 "rpc worker-1469" 0x00007f042c435ad3 in ?? ()
  40   LWP 14700 "rpc worker-1470" 0x00007f042c435ad3 in ?? ()
  41   LWP 14701 "rpc worker-1470" 0x00007f042c435ad3 in ?? ()
  42   LWP 14702 "rpc worker-1470" 0x00007f042c435ad3 in ?? ()
  43   LWP 14703 "rpc worker-1470" 0x00007f042c435ad3 in ?? ()
  44   LWP 14704 "rpc worker-1470" 0x00007f042c435ad3 in ?? ()
  45   LWP 14705 "rpc worker-1470" 0x00007f042c435ad3 in ?? ()
  46   LWP 14706 "rpc worker-1470" 0x00007f042c435ad3 in ?? ()
  47   LWP 14707 "rpc worker-1470" 0x00007f042c435ad3 in ?? ()
  48   LWP 14708 "rpc worker-1470" 0x00007f042c435ad3 in ?? ()
  49   LWP 14709 "rpc worker-1470" 0x00007f042c435ad3 in ?? ()
  50   LWP 14710 "rpc worker-1471" 0x00007f042c435ad3 in ?? ()
  51   LWP 14711 "rpc worker-1471" 0x00007f042c435ad3 in ?? ()
  52   LWP 14712 "rpc worker-1471" 0x00007f042c435ad3 in ?? ()
  53   LWP 14713 "rpc worker-1471" 0x00007f042c435ad3 in ?? ()
  54   LWP 14714 "rpc worker-1471" 0x00007f042c435ad3 in ?? ()
  55   LWP 14715 "rpc worker-1471" 0x00007f042c435ad3 in ?? ()
  56   LWP 14716 "rpc worker-1471" 0x00007f042c435ad3 in ?? ()
  57   LWP 14717 "rpc worker-1471" 0x00007f042c435ad3 in ?? ()
  58   LWP 14718 "rpc worker-1471" 0x00007f042c435ad3 in ?? ()
  59   LWP 14719 "rpc worker-1471" 0x00007f042c435ad3 in ?? ()
  60   LWP 14720 "rpc worker-1472" 0x00007f042c435ad3 in ?? ()
  61   LWP 14721 "rpc worker-1472" 0x00007f042c435ad3 in ?? ()
  62   LWP 14722 "rpc worker-1472" 0x00007f042c435ad3 in ?? ()
  63   LWP 14723 "rpc worker-1472" 0x00007f042c435ad3 in ?? ()
  64   LWP 14724 "rpc worker-1472" 0x00007f042c435ad3 in ?? ()
  65   LWP 14725 "rpc worker-1472" 0x00007f042c435ad3 in ?? ()
  66   LWP 14726 "rpc worker-1472" 0x00007f042c435ad3 in ?? ()
  67   LWP 14727 "rpc worker-1472" 0x00007f042c435ad3 in ?? ()
  68   LWP 14728 "rpc worker-1472" 0x00007f042c435ad3 in ?? ()
  69   LWP 14729 "rpc worker-1472" 0x00007f042c435ad3 in ?? ()
  70   LWP 14730 "rpc worker-1473" 0x00007f042c435ad3 in ?? ()
  71   LWP 14731 "rpc worker-1473" 0x00007f042c435ad3 in ?? ()
  72   LWP 14732 "rpc worker-1473" 0x00007f042c435ad3 in ?? ()
  73   LWP 14733 "rpc worker-1473" 0x00007f042c435ad3 in ?? ()
  74   LWP 14734 "rpc worker-1473" 0x00007f042c435ad3 in ?? ()
  75   LWP 14735 "rpc worker-1473" 0x00007f042c435ad3 in ?? ()
  76   LWP 14736 "rpc worker-1473" 0x00007f042c435ad3 in ?? ()
  77   LWP 14737 "rpc worker-1473" 0x00007f042c435ad3 in ?? ()
  78   LWP 14738 "rpc worker-1473" 0x00007f042c435ad3 in ?? ()
  79   LWP 14739 "rpc worker-1473" 0x00007f042c435ad3 in ?? ()
  80   LWP 14740 "rpc worker-1474" 0x00007f042c435ad3 in ?? ()
  81   LWP 14741 "rpc worker-1474" 0x00007f042c435ad3 in ?? ()
  82   LWP 14742 "rpc worker-1474" 0x00007f042c435ad3 in ?? ()
  83   LWP 14743 "rpc worker-1474" 0x00007f042c435ad3 in ?? ()
  84   LWP 14744 "rpc worker-1474" 0x00007f042c435ad3 in ?? ()
  85   LWP 14745 "rpc worker-1474" 0x00007f042c435ad3 in ?? ()
  86   LWP 14746 "rpc worker-1474" 0x00007f042c435ad3 in ?? ()
  87   LWP 14747 "rpc worker-1474" 0x00007f042c435ad3 in ?? ()
  88   LWP 14748 "rpc worker-1474" 0x00007f042c435ad3 in ?? ()
  89   LWP 14749 "rpc worker-1474" 0x00007f042c435ad3 in ?? ()
  90   LWP 14750 "rpc worker-1475" 0x00007f042c435ad3 in ?? ()
  91   LWP 14751 "rpc worker-1475" 0x00007f042c435ad3 in ?? ()
  92   LWP 14752 "rpc worker-1475" 0x00007f042c435ad3 in ?? ()
  93   LWP 14753 "rpc worker-1475" 0x00007f042c435ad3 in ?? ()
  94   LWP 14754 "rpc worker-1475" 0x00007f042c435ad3 in ?? ()
  95   LWP 14755 "rpc worker-1475" 0x00007f042c435ad3 in ?? ()
  96   LWP 14756 "rpc worker-1475" 0x00007f042c435ad3 in ?? ()
  97   LWP 14757 "rpc worker-1475" 0x00007f042c435ad3 in ?? ()
  98   LWP 14758 "rpc worker-1475" 0x00007f042c435ad3 in ?? ()
  99   LWP 14759 "rpc worker-1475" 0x00007f042c435ad3 in ?? ()
  100  LWP 14760 "rpc worker-1476" 0x00007f042c435ad3 in ?? ()
  101  LWP 14761 "rpc worker-1476" 0x00007f042c435ad3 in ?? ()
  102  LWP 14762 "rpc worker-1476" 0x00007f042c435ad3 in ?? ()
  103  LWP 14763 "rpc worker-1476" 0x00007f042c435ad3 in ?? ()
  104  LWP 14764 "rpc worker-1476" 0x00007f042c435ad3 in ?? ()
  105  LWP 14765 "rpc worker-1476" 0x00007f042c435ad3 in ?? ()
  106  LWP 14766 "rpc worker-1476" 0x00007f042c435ad3 in ?? ()
  107  LWP 14767 "rpc worker-1476" 0x00007f042c435ad3 in ?? ()
  108  LWP 14768 "rpc worker-1476" 0x00007f042c435ad3 in ?? ()
  109  LWP 14769 "rpc worker-1476" 0x00007f042c435ad3 in ?? ()
  110  LWP 14770 "rpc worker-1477" 0x00007f042c435ad3 in ?? ()
  111  LWP 14771 "rpc worker-1477" 0x00007f042c435ad3 in ?? ()
  112  LWP 14772 "rpc worker-1477" 0x00007f042c435ad3 in ?? ()
  113  LWP 14773 "rpc worker-1477" 0x00007f042c435ad3 in ?? ()
  114  LWP 14774 "rpc worker-1477" 0x00007f042c435ad3 in ?? ()
  115  LWP 14775 "rpc worker-1477" 0x00007f042c435ad3 in ?? ()
  116  LWP 14776 "rpc worker-1477" 0x00007f042c435ad3 in ?? ()
  117  LWP 14777 "diag-logger-147" 0x00007f042c435fb9 in ?? ()
  118  LWP 14778 "result-tracker-" 0x00007f042c435fb9 in ?? ()
  119  LWP 14779 "excess-log-dele" 0x00007f042c435fb9 in ?? ()
  120  LWP 14780 "tcmalloc-memory" 0x00007f042c435fb9 in ?? ()
  121  LWP 14781 "acceptor-14781" 0x00007f042a5590c7 in ?? ()
  122  LWP 14782 "heartbeat-14782" 0x00007f042c435fb9 in ?? ()
  123  LWP 14783 "maintenance_sch" 0x00007f042c435fb9 in ?? ()
Thread 123 (LWP 14783):
#0  0x00007f042c435fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000024 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a4c59a7e50 in ?? ()
#5  0x00007f03e36dd470 in ?? ()
#6  0x0000000000000048 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 122 (LWP 14782):
#0  0x00007f042c435fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a4c5911930 in ?? ()
#5  0x00007f03e3ede3f0 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 121 (LWP 14781):
#0  0x00007f042a5590c7 in ?? ()
#1  0x00007f03e46df020 in ?? ()
#2  0x00007f042c0b9c02 in ?? ()
#3  0x00007f03e46df020 in ?? ()
#4  0x0000000000080800 in ?? ()
#5  0x00007f03e46df3e0 in ?? ()
#6  0x00007f03e46df090 in ?? ()
#7  0x000055a4c58cd0f8 in ?? ()
#8  0x00007f042c0bf699 in ?? ()
#9  0x00007f03e46df510 in ?? ()
#10 0x00007f03e46df700 in ?? ()
#11 0x0000008000000000 in ?? ()
#12 0x00007f042c4393a7 in ?? ()
#13 0x00007f03e46e0520 in ?? ()
#14 0x00007f03e46df260 in ?? ()
#15 0x000055a4c5996000 in ?? ()
#16 0x0000000000000000 in ?? ()
Thread 120 (LWP 14780):
#0  0x00007f042c435fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007ffe3b0b6b20 in ?? ()
#5  0x00007f03e4ee0670 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 119 (LWP 14779):
#0  0x00007f042c435fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()
Thread 118 (LWP 14778):
#0  0x00007f042c435fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a4c5845b70 in ?? ()
#5  0x00007f03e5ee2680 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 117 (LWP 14777):
#0  0x00007f042c435fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a4c5b17a90 in ?? ()
#5  0x00007f03e66e3550 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 116 (LWP 14776):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 115 (LWP 14775):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 114 (LWP 14774):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 113 (LWP 14773):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 112 (LWP 14772):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 111 (LWP 14771):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 110 (LWP 14770):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 109 (LWP 14769):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 108 (LWP 14768):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 107 (LWP 14767):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 106 (LWP 14766):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 105 (LWP 14765):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 104 (LWP 14764):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 103 (LWP 14763):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 102 (LWP 14762):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000007 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055a4c5b4d6bc in ?? ()
#4  0x00007f03edef25d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f03edef25f0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055a4c5b4d6a8 in ?? ()
#9  0x00007f042c435770 in ?? ()
#10 0x00007f03edef25f0 in ?? ()
#11 0x00007f03edef2650 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 101 (LWP 14761):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 100 (LWP 14760):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 99 (LWP 14759):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 98 (LWP 14758):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 97 (LWP 14757):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055a4c5b4d63c in ?? ()
#4  0x00007f03f06f75d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f03f06f75f0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055a4c5b4d628 in ?? ()
#9  0x00007f042c435770 in ?? ()
#10 0x00007f03f06f75f0 in ?? ()
#11 0x00007f03f06f7650 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 96 (LWP 14756):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 95 (LWP 14755):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 94 (LWP 14754):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 93 (LWP 14753):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 92 (LWP 14752):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 91 (LWP 14751):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 90 (LWP 14750):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 89 (LWP 14749):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 88 (LWP 14748):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 87 (LWP 14747):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 86 (LWP 14746):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 85 (LWP 14745):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 84 (LWP 14744):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 83 (LWP 14743):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 82 (LWP 14742):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 81 (LWP 14741):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 80 (LWP 14740):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 79 (LWP 14739):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 78 (LWP 14738):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 77 (LWP 14737):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 76 (LWP 14736):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055a4c5b4c9b8 in ?? ()
#4  0x00007f03faf0c5d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f03faf0c5f0 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 75 (LWP 14735):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 74 (LWP 14734):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 73 (LWP 14733):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 72 (LWP 14732):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 71 (LWP 14731):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 70 (LWP 14730):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 69 (LWP 14729):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 68 (LWP 14728):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 67 (LWP 14727):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 66 (LWP 14726):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 65 (LWP 14725):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 64 (LWP 14724):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 63 (LWP 14723):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 62 (LWP 14722):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 61 (LWP 14721):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 60 (LWP 14720):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 59 (LWP 14719):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 58 (LWP 14718):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 57 (LWP 14717):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 56 (LWP 14716):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055a4c5b49eb8 in ?? ()
#4  0x00007f0404f205d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f0404f205f0 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 55 (LWP 14715):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 54 (LWP 14714):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 53 (LWP 14713):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 52 (LWP 14712):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 51 (LWP 14711):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 50 (LWP 14710):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 49 (LWP 14709):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 48 (LWP 14708):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 47 (LWP 14707):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 46 (LWP 14706):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 45 (LWP 14705):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 44 (LWP 14704):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 43 (LWP 14703):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 42 (LWP 14702):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 41 (LWP 14701):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 40 (LWP 14700):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 39 (LWP 14699):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 38 (LWP 14698):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 37 (LWP 14697):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 36 (LWP 14696):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055a4c5b493b8 in ?? ()
#4  0x00007f040ef345d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f040ef345f0 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 35 (LWP 14695):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 34 (LWP 14694):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 33 (LWP 14693):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 32 (LWP 14692):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 31 (LWP 14691):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 30 (LWP 14690):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 29 (LWP 14689):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 28 (LWP 14688):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 27 (LWP 14687):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 26 (LWP 14686):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 25 (LWP 14685):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 24 (LWP 14684):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 23 (LWP 14683):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 22 (LWP 14682):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 21 (LWP 14681):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 20 (LWP 14680):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 19 (LWP 14679):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 18 (LWP 14678):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 17 (LWP 14677):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 16 (LWP 14676):
#0  0x00007f042c435fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()
Thread 15 (LWP 14675):
#0  0x00007f042c435fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a4c582b6c8 in ?? ()
#5  0x00007f04197496a0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 14 (LWP 14674):
#0  0x00007f042c435fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()
Thread 13 (LWP 14673):
#0  0x00007f042c435ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 12 (LWP 14672):
#0  0x00007f042a557a47 in ?? ()
#1  0x00007f041a74b680 in ?? ()
#2  0x00007f042585b571 in ?? ()
#3  0x00000000000002e2 in ?? ()
#4  0x000055a4c5922e58 in ?? ()
#5  0x00007f041a74b6c0 in ?? ()
#6  0x00007f041a74b840 in ?? ()
#7  0x000055a4c59c6d30 in ?? ()
#8  0x00007f042585d25d in ?? ()
#9  0x3fb95b3a2728c000 in ?? ()
#10 0x000055a4c5914c00 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055a4c5914c00 in ?? ()
#13 0x00000000c5922e58 in ?? ()
#14 0x000055a400000000 in ?? ()
#15 0x41da4020d459884c in ?? ()
#16 0x000055a4c59c6d30 in ?? ()
#17 0x00007f041a74b720 in ?? ()
#18 0x00007f0425861ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb95b3a2728c000 in ?? ()
#21 0x0000000000000000 in ?? ()
Thread 11 (LWP 14671):
#0  0x00007f042a557a47 in ?? ()
#1  0x00007f041af4c680 in ?? ()
#2  0x00007f042585b571 in ?? ()
#3  0x00000000000002e2 in ?? ()
#4  0x000055a4c5923a98 in ?? ()
#5  0x00007f041af4c6c0 in ?? ()
#6  0x00007f041af4c840 in ?? ()
#7  0x000055a4c59c6d30 in ?? ()
#8  0x00007f042585d25d in ?? ()
#9  0x3fb97671ea528000 in ?? ()
#10 0x000055a4c5914100 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055a4c5914100 in ?? ()
#13 0x00000000c5923a98 in ?? ()
#14 0x000055a400000000 in ?? ()
#15 0x41da4020d459884b in ?? ()
#16 0x000055a4c59c6d30 in ?? ()
#17 0x00007f041af4c720 in ?? ()
#18 0x00007f0425861ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb97671ea528000 in ?? ()
#21 0x0000000000000000 in ?? ()
Thread 10 (LWP 14670):
#0  0x00007f042a557a47 in ?? ()
#1  0x00007f041b74d680 in ?? ()
#2  0x00007f042585b571 in ?? ()
#3  0x00000000000002e2 in ?? ()
#4  0x000055a4c5923c58 in ?? ()
#5  0x00007f041b74d6c0 in ?? ()
#6  0x00007f041b74d840 in ?? ()
#7  0x000055a4c59c6d30 in ?? ()
#8  0x00007f042585d25d in ?? ()
#9  0x3fb97c77f9f24000 in ?? ()
#10 0x000055a4c5914680 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055a4c5914680 in ?? ()
#13 0x00000000c5923c58 in ?? ()
#14 0x000055a400000000 in ?? ()
#15 0x41da4020d459884c in ?? ()
#16 0x000055a4c59c6d30 in ?? ()
#17 0x00007f041b74d720 in ?? ()
#18 0x00007f0425861ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb97c77f9f24000 in ?? ()
#21 0x0000000000000000 in ?? ()
Thread 9 (LWP 14669):
#0  0x00007f042a557a47 in ?? ()
#1  0x00007f041bf4e680 in ?? ()
#2  0x00007f042585b571 in ?? ()
#3  0x00000000000002e2 in ?? ()
#4  0x000055a4c5923e18 in ?? ()
#5  0x00007f041bf4e6c0 in ?? ()
#6  0x00007f041bf4e840 in ?? ()
#7  0x000055a4c59c6d30 in ?? ()
#8  0x00007f042585d25d in ?? ()
#9  0x3fb969269706c000 in ?? ()
#10 0x000055a4c5913600 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055a4c5913600 in ?? ()
#13 0x00000000c5923e18 in ?? ()
#14 0x000055a400000000 in ?? ()
#15 0x41da4020d459884f in ?? ()
#16 0x000055a4c59c6d30 in ?? ()
#17 0x00007f041bf4e720 in ?? ()
#18 0x00007f0425861ba3 in ?? ()
#19 0x0000000000000000 in ?? ()
Thread 8 (LWP 14666):
#0  0x00007f042a54acb9 in ?? ()
#1  0x00007f041f33c840 in ?? ()
#2  0x0000000000000000 in ?? ()
Thread 7 (LWP 14665):
#0  0x00007f042c435fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()
Thread 6 (LWP 14664):
#0  0x00007f042c4399e2 in ?? ()
#1  0x000055a4c5845ee0 in ?? ()
#2  0x00007f041e33a4d0 in ?? ()
#3  0x00007f041e33a450 in ?? ()
#4  0x00007f041e33a570 in ?? ()
#5  0x00007f041e33a790 in ?? ()
#6  0x00007f041e33a7a0 in ?? ()
#7  0x00007f041e33a4e0 in ?? ()
#8  0x00007f041e33a4d0 in ?? ()
#9  0x000055a4c5844350 in ?? ()
#10 0x00007f042c824c6f in ?? ()
#11 0x3ff0000000000000 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 5 (LWP 14658):
#0  0x00007f042c435fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x000000000000002d in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a4c59cadc8 in ?? ()
#5  0x00007f042033e430 in ?? ()
#6  0x000000000000005a in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 4 (LWP 14657):
#0  0x00007f042c435fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a4c582a848 in ?? ()
#5  0x00007f0420b3f790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 3 (LWP 14656):
#0  0x00007f042c435fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a4c582a2a8 in ?? ()
#5  0x00007f0421340790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 2 (LWP 14655):
#0  0x00007f042c435fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a4c582a188 in ?? ()
#5  0x00007f0421b41790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 1 (LWP 14651):
#0  0x00007f042c439d50 in ?? ()
#1  0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20251028 09:11:05.499285  8282 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 3 with UUID 3fedf3ecbec146b6a6ba544511a16fa1 and pid 14519
************************ BEGIN STACKS **************************
[New LWP 14521]
[New LWP 14522]
[New LWP 14523]
[New LWP 14524]
[New LWP 14530]
[New LWP 14531]
[New LWP 14532]
[New LWP 14535]
[New LWP 14536]
[New LWP 14537]
[New LWP 14538]
[New LWP 14539]
[New LWP 14540]
[New LWP 14542]
[New LWP 14543]
[New LWP 14544]
[New LWP 14545]
[New LWP 14546]
[New LWP 14547]
[New LWP 14548]
[New LWP 14549]
[New LWP 14550]
[New LWP 14551]
[New LWP 14552]
[New LWP 14553]
[New LWP 14554]
[New LWP 14555]
[New LWP 14556]
[New LWP 14557]
[New LWP 14558]
[New LWP 14559]
[New LWP 14560]
[New LWP 14561]
[New LWP 14562]
[New LWP 14563]
[New LWP 14564]
[New LWP 14565]
[New LWP 14566]
[New LWP 14567]
[New LWP 14568]
[New LWP 14569]
[New LWP 14570]
[New LWP 14571]
[New LWP 14572]
[New LWP 14573]
[New LWP 14574]
[New LWP 14575]
[New LWP 14576]
[New LWP 14577]
[New LWP 14578]
[New LWP 14579]
[New LWP 14580]
[New LWP 14581]
[New LWP 14582]
[New LWP 14583]
[New LWP 14584]
[New LWP 14585]
[New LWP 14586]
[New LWP 14587]
[New LWP 14588]
[New LWP 14589]
[New LWP 14590]
[New LWP 14591]
[New LWP 14592]
[New LWP 14593]
[New LWP 14594]
[New LWP 14595]
[New LWP 14596]
[New LWP 14597]
[New LWP 14598]
[New LWP 14599]
[New LWP 14600]
[New LWP 14601]
[New LWP 14602]
[New LWP 14603]
[New LWP 14604]
[New LWP 14605]
[New LWP 14606]
[New LWP 14607]
[New LWP 14608]
[New LWP 14609]
[New LWP 14610]
[New LWP 14611]
[New LWP 14612]
[New LWP 14613]
[New LWP 14614]
[New LWP 14615]
[New LWP 14616]
[New LWP 14617]
[New LWP 14618]
[New LWP 14619]
[New LWP 14620]
[New LWP 14621]
[New LWP 14622]
[New LWP 14623]
[New LWP 14624]
[New LWP 14625]
[New LWP 14626]
[New LWP 14627]
[New LWP 14628]
[New LWP 14629]
[New LWP 14630]
[New LWP 14631]
[New LWP 14632]
[New LWP 14633]
[New LWP 14634]
[New LWP 14635]
[New LWP 14636]
[New LWP 14637]
[New LWP 14638]
[New LWP 14639]
[New LWP 14640]
[New LWP 14641]
[New LWP 14642]
[New LWP 14643]
[New LWP 14644]
[New LWP 14645]
[New LWP 14646]
[New LWP 14647]
[New LWP 14648]
[New LWP 14649]
[New LWP 14650]
0x00007fe4c427dd50 in ?? ()
  Id   Target Id         Frame 
* 1    LWP 14519 "kudu"  0x00007fe4c427dd50 in ?? ()
  2    LWP 14521 "kudu"  0x00007fe4c4279fb9 in ?? ()
  3    LWP 14522 "kudu"  0x00007fe4c4279fb9 in ?? ()
  4    LWP 14523 "kudu"  0x00007fe4c4279fb9 in ?? ()
  5    LWP 14524 "kernel-watcher-" 0x00007fe4c4279fb9 in ?? ()
  6    LWP 14530 "ntp client-1453" 0x00007fe4c427d9e2 in ?? ()
  7    LWP 14531 "file cache-evic" 0x00007fe4c4279fb9 in ?? ()
  8    LWP 14532 "sq_acceptor" 0x00007fe4c238ecb9 in ?? ()
  9    LWP 14535 "rpc reactor-145" 0x00007fe4c239ba47 in ?? ()
  10   LWP 14536 "rpc reactor-145" 0x00007fe4c239ba47 in ?? ()
  11   LWP 14537 "rpc reactor-145" 0x00007fe4c239ba47 in ?? ()
  12   LWP 14538 "rpc reactor-145" 0x00007fe4c239ba47 in ?? ()
  13   LWP 14539 "MaintenanceMgr " 0x00007fe4c4279ad3 in ?? ()
  14   LWP 14540 "txn-status-mana" 0x00007fe4c4279fb9 in ?? ()
  15   LWP 14542 "collect_and_rem" 0x00007fe4c4279fb9 in ?? ()
  16   LWP 14543 "tc-session-exp-" 0x00007fe4c4279fb9 in ?? ()
  17   LWP 14544 "rpc worker-1454" 0x00007fe4c4279ad3 in ?? ()
  18   LWP 14545 "rpc worker-1454" 0x00007fe4c4279ad3 in ?? ()
  19   LWP 14546 "rpc worker-1454" 0x00007fe4c4279ad3 in ?? ()
  20   LWP 14547 "rpc worker-1454" 0x00007fe4c4279ad3 in ?? ()
  21   LWP 14548 "rpc worker-1454" 0x00007fe4c4279ad3 in ?? ()
  22   LWP 14549 "rpc worker-1454" 0x00007fe4c4279ad3 in ?? ()
  23   LWP 14550 "rpc worker-1455" 0x00007fe4c4279ad3 in ?? ()
  24   LWP 14551 "rpc worker-1455" 0x00007fe4c4279ad3 in ?? ()
  25   LWP 14552 "rpc worker-1455" 0x00007fe4c4279ad3 in ?? ()
  26   LWP 14553 "rpc worker-1455" 0x00007fe4c4279ad3 in ?? ()
  27   LWP 14554 "rpc worker-1455" 0x00007fe4c4279ad3 in ?? ()
  28   LWP 14555 "rpc worker-1455" 0x00007fe4c4279ad3 in ?? ()
  29   LWP 14556 "rpc worker-1455" 0x00007fe4c4279ad3 in ?? ()
  30   LWP 14557 "rpc worker-1455" 0x00007fe4c4279ad3 in ?? ()
  31   LWP 14558 "rpc worker-1455" 0x00007fe4c4279ad3 in ?? ()
  32   LWP 14559 "rpc worker-1455" 0x00007fe4c4279ad3 in ?? ()
  33   LWP 14560 "rpc worker-1456" 0x00007fe4c4279ad3 in ?? ()
  34   LWP 14561 "rpc worker-1456" 0x00007fe4c4279ad3 in ?? ()
  35   LWP 14562 "rpc worker-1456" 0x00007fe4c4279ad3 in ?? ()
  36   LWP 14563 "rpc worker-1456" 0x00007fe4c4279ad3 in ?? ()
  37   LWP 14564 "rpc worker-1456" 0x00007fe4c4279ad3 in ?? ()
  38   LWP 14565 "rpc worker-1456" 0x00007fe4c4279ad3 in ?? ()
  39   LWP 14566 "rpc worker-1456" 0x00007fe4c4279ad3 in ?? ()
  40   LWP 14567 "rpc worker-1456" 0x00007fe4c4279ad3 in ?? ()
  41   LWP 14568 "rpc worker-1456" 0x00007fe4c4279ad3 in ?? ()
  42   LWP 14569 "rpc worker-1456" 0x00007fe4c4279ad3 in ?? ()
  43   LWP 14570 "rpc worker-1457" 0x00007fe4c4279ad3 in ?? ()
  44   LWP 14571 "rpc worker-1457" 0x00007fe4c4279ad3 in ?? ()
  45   LWP 14572 "rpc worker-1457" 0x00007fe4c4279ad3 in ?? ()
  46   LWP 14573 "rpc worker-1457" 0x00007fe4c4279ad3 in ?? ()
  47   LWP 14574 "rpc worker-1457" 0x00007fe4c4279ad3 in ?? ()
  48   LWP 14575 "rpc worker-1457" 0x00007fe4c4279ad3 in ?? ()
  49   LWP 14576 "rpc worker-1457" 0x00007fe4c4279ad3 in ?? ()
  50   LWP 14577 "rpc worker-1457" 0x00007fe4c4279ad3 in ?? ()
  51   LWP 14578 "rpc worker-1457" 0x00007fe4c4279ad3 in ?? ()
  52   LWP 14579 "rpc worker-1457" 0x00007fe4c4279ad3 in ?? ()
  53   LWP 14580 "rpc worker-1458" 0x00007fe4c4279ad3 in ?? ()
  54   LWP 14581 "rpc worker-1458" 0x00007fe4c4279ad3 in ?? ()
  55   LWP 14582 "rpc worker-1458" 0x00007fe4c4279ad3 in ?? ()
  56   LWP 14583 "rpc worker-1458" 0x00007fe4c4279ad3 in ?? ()
  57   LWP 14584 "rpc worker-1458" 0x00007fe4c4279ad3 in ?? ()
  58   LWP 14585 "rpc worker-1458" 0x00007fe4c4279ad3 in ?? ()
  59   LWP 14586 "rpc worker-1458" 0x00007fe4c4279ad3 in ?? ()
  60   LWP 14587 "rpc worker-1458" 0x00007fe4c4279ad3 in ?? ()
  61   LWP 14588 "rpc worker-1458" 0x00007fe4c4279ad3 in ?? ()
  62   LWP 14589 "rpc worker-1458" 0x00007fe4c4279ad3 in ?? ()
  63   LWP 14590 "rpc worker-1459" 0x00007fe4c4279ad3 in ?? ()
  64   LWP 14591 "rpc worker-1459" 0x00007fe4c4279ad3 in ?? ()
  65   LWP 14592 "rpc worker-1459" 0x00007fe4c4279ad3 in ?? ()
  66   LWP 14593 "rpc worker-1459" 0x00007fe4c4279ad3 in ?? ()
  67   LWP 14594 "rpc worker-1459" 0x00007fe4c4279ad3 in ?? ()
  68   LWP 14595 "rpc worker-1459" 0x00007fe4c4279ad3 in ?? ()
  69   LWP 14596 "rpc worker-1459" 0x00007fe4c4279ad3 in ?? ()
  70   LWP 14597 "rpc worker-1459" 0x00007fe4c4279ad3 in ?? ()
  71   LWP 14598 "rpc worker-1459" 0x00007fe4c4279ad3 in ?? ()
  72   LWP 14599 "rpc worker-1459" 0x00007fe4c4279ad3 in ?? ()
  73   LWP 14600 "rpc worker-1460" 0x00007fe4c4279ad3 in ?? ()
  74   LWP 14601 "rpc worker-1460" 0x00007fe4c4279ad3 in ?? ()
  75   LWP 14602 "rpc worker-1460" 0x00007fe4c4279ad3 in ?? ()
  76   LWP 14603 "rpc worker-1460" 0x00007fe4c4279ad3 in ?? ()
  77   LWP 14604 "rpc worker-1460" 0x00007fe4c4279ad3 in ?? ()
  78   LWP 14605 "rpc worker-1460" 0x00007fe4c4279ad3 in ?? ()
  79   LWP 14606 "rpc worker-1460" 0x00007fe4c4279ad3 in ?? ()
  80   LWP 14607 "rpc worker-1460" 0x00007fe4c4279ad3 in ?? ()
  81   LWP 14608 "rpc worker-1460" 0x00007fe4c4279ad3 in ?? ()
  82   LWP 14609 "rpc worker-1460" 0x00007fe4c4279ad3 in ?? ()
  83   LWP 14610 "rpc worker-1461" 0x00007fe4c4279ad3 in ?? ()
  84   LWP 14611 "rpc worker-1461" 0x00007fe4c4279ad3 in ?? ()
  85   LWP 14612 "rpc worker-1461" 0x00007fe4c4279ad3 in ?? ()
  86   LWP 14613 "rpc worker-1461" 0x00007fe4c4279ad3 in ?? ()
  87   LWP 14614 "rpc worker-1461" 0x00007fe4c4279ad3 in ?? ()
  88   LWP 14615 "rpc worker-1461" 0x00007fe4c4279ad3 in ?? ()
  89   LWP 14616 "rpc worker-1461" 0x00007fe4c4279ad3 in ?? ()
  90   LWP 14617 "rpc worker-1461" 0x00007fe4c4279ad3 in ?? ()
  91   LWP 14618 "rpc worker-1461" 0x00007fe4c4279ad3 in ?? ()
  92   LWP 14619 "rpc worker-1461" 0x00007fe4c4279ad3 in ?? ()
  93   LWP 14620 "rpc worker-1462" 0x00007fe4c4279ad3 in ?? ()
  94   LWP 14621 "rpc worker-1462" 0x00007fe4c4279ad3 in ?? ()
  95   LWP 14622 "rpc worker-1462" 0x00007fe4c4279ad3 in ?? ()
  96   LWP 14623 "rpc worker-1462" 0x00007fe4c4279ad3 in ?? ()
  97   LWP 14624 "rpc worker-1462" 0x00007fe4c4279ad3 in ?? ()
  98   LWP 14625 "rpc worker-1462" 0x00007fe4c4279ad3 in ?? ()
  99   LWP 14626 "rpc worker-1462" 0x00007fe4c4279ad3 in ?? ()
  100  LWP 14627 "rpc worker-1462" 0x00007fe4c4279ad3 in ?? ()
  101  LWP 14628 "rpc worker-1462" 0x00007fe4c4279ad3 in ?? ()
  102  LWP 14629 "rpc worker-1462" 0x00007fe4c4279ad3 in ?? ()
  103  LWP 14630 "rpc worker-1463" 0x00007fe4c4279ad3 in ?? ()
  104  LWP 14631 "rpc worker-1463" 0x00007fe4c4279ad3 in ?? ()
  105  LWP 14632 "rpc worker-1463" 0x00007fe4c4279ad3 in ?? ()
  106  LWP 14633 "rpc worker-1463" 0x00007fe4c4279ad3 in ?? ()
  107  LWP 14634 "rpc worker-1463" 0x00007fe4c4279ad3 in ?? ()
  108  LWP 14635 "rpc worker-1463" 0x00007fe4c4279ad3 in ?? ()
  109  LWP 14636 "rpc worker-1463" 0x00007fe4c4279ad3 in ?? ()
  110  LWP 14637 "rpc worker-1463" 0x00007fe4c4279ad3 in ?? ()
  111  LWP 14638 "rpc worker-1463" 0x00007fe4c4279ad3 in ?? ()
  112  LWP 14639 "rpc worker-1463" 0x00007fe4c4279ad3 in ?? ()
  113  LWP 14640 "rpc worker-1464" 0x00007fe4c4279ad3 in ?? ()
  114  LWP 14641 "rpc worker-1464" 0x00007fe4c4279ad3 in ?? ()
  115  LWP 14642 "rpc worker-1464" 0x00007fe4c4279ad3 in ?? ()
  116  LWP 14643 "rpc worker-1464" 0x00007fe4c4279ad3 in ?? ()
  117  LWP 14644 "diag-logger-146" 0x00007fe4c4279fb9 in ?? ()
  118  LWP 14645 "result-tracker-" 0x00007fe4c4279fb9 in ?? ()
  119  LWP 14646 "excess-log-dele" 0x00007fe4c4279fb9 in ?? ()
  120  LWP 14647 "tcmalloc-memory" 0x00007fe4c4279fb9 in ?? ()
  121  LWP 14648 "acceptor-14648" 0x00007fe4c239d0c7 in ?? ()
  122  LWP 14649 "heartbeat-14649" 0x00007fe4c4279fb9 in ?? ()
  123  LWP 14650 "maintenance_sch" 0x00007fe4c4279fb9 in ?? ()
Thread 123 (LWP 14650):
#0  0x00007fe4c4279fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000027 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055c7d0119e50 in ?? ()
#5  0x00007fe47ad20470 in ?? ()
#6  0x000000000000004e in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 122 (LWP 14649):
#0  0x00007fe4c4279fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x000000000000000c in ?? ()
#3  0x0000000100000081 in ?? ()
#4  0x000055c7d0083934 in ?? ()
#5  0x00007fe47b5213f0 in ?? ()
#6  0x0000000000000019 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x00007fe47b521410 in ?? ()
#9  0x0000000200000000 in ?? ()
#10 0x0000008000000189 in ?? ()
#11 0x00007fe47b521470 in ?? ()
#12 0x00007fe4c3eed2e1 in ?? ()
#13 0x0000000000000000 in ?? ()
Thread 121 (LWP 14648):
#0  0x00007fe4c239d0c7 in ?? ()
#1  0x00007fe47bd22020 in ?? ()
#2  0x00007fe4c3efdc02 in ?? ()
#3  0x00007fe47bd22020 in ?? ()
#4  0x0000000000080800 in ?? ()
#5  0x00007fe47bd223e0 in ?? ()
#6  0x00007fe47bd22090 in ?? ()
#7  0x000055c7d003f0f8 in ?? ()
#8  0x00007fe4c3f03699 in ?? ()
#9  0x00007fe47bd22510 in ?? ()
#10 0x00007fe47bd22700 in ?? ()
#11 0x0000008000000000 in ?? ()
#12 0x00007fe4c427d3a7 in ?? ()
#13 0x00007fe47bd23520 in ?? ()
#14 0x00007fe47bd22260 in ?? ()
#15 0x000055c7d0108000 in ?? ()
#16 0x0000000000000000 in ?? ()
Thread 120 (LWP 14647):
#0  0x00007fe4c4279fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007ffcec626650 in ?? ()
#5  0x00007fe47c523670 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 119 (LWP 14646):
#0  0x00007fe4c4279fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()
Thread 118 (LWP 14645):
#0  0x00007fe4c4279fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055c7cffb7b70 in ?? ()
#5  0x00007fe47d525680 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 117 (LWP 14644):
#0  0x00007fe4c4279fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055c7d0330790 in ?? ()
#5  0x00007fe47dd26550 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 116 (LWP 14643):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 115 (LWP 14642):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 114 (LWP 14641):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 113 (LWP 14640):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 112 (LWP 14639):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 111 (LWP 14638):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 110 (LWP 14637):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000007 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055c7d0535c3c in ?? ()
#4  0x00007fe48152d5d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fe48152d5f0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055c7d0535c28 in ?? ()
#9  0x00007fe4c4279770 in ?? ()
#10 0x00007fe48152d5f0 in ?? ()
#11 0x00007fe48152d650 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 109 (LWP 14636):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055c7d0535bbc in ?? ()
#4  0x00007fe481d2e5d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fe481d2e5f0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055c7d0535ba8 in ?? ()
#9  0x00007fe4c4279770 in ?? ()
#10 0x00007fe481d2e5f0 in ?? ()
#11 0x00007fe481d2e650 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 108 (LWP 14635):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 107 (LWP 14634):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 106 (LWP 14633):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 105 (LWP 14632):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 104 (LWP 14631):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 103 (LWP 14630):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 102 (LWP 14629):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 101 (LWP 14628):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 100 (LWP 14627):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 99 (LWP 14626):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 98 (LWP 14625):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 97 (LWP 14624):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 96 (LWP 14623):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 95 (LWP 14622):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 94 (LWP 14621):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 93 (LWP 14620):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 92 (LWP 14619):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 91 (LWP 14618):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 90 (LWP 14617):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 89 (LWP 14616):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 88 (LWP 14615):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 87 (LWP 14614):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 86 (LWP 14613):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 85 (LWP 14612):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 84 (LWP 14611):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 83 (LWP 14610):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 82 (LWP 14609):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 81 (LWP 14608):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 80 (LWP 14607):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 79 (LWP 14606):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 78 (LWP 14605):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 77 (LWP 14604):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 76 (LWP 14603):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x00000000000001a9 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055c7d04838bc in ?? ()
#4  0x00007fe49254f5d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fe49254f5f0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055c7d04838a8 in ?? ()
#9  0x00007fe4c4279770 in ?? ()
#10 0x00007fe49254f5f0 in ?? ()
#11 0x00007fe49254f650 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 75 (LWP 14602):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x000000000000038d in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055c7d04837bc in ?? ()
#4  0x00007fe492d505d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fe492d505f0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055c7d04837a8 in ?? ()
#9  0x00007fe4c4279770 in ?? ()
#10 0x00007fe492d505f0 in ?? ()
#11 0x00007fe492d50650 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 74 (LWP 14601):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 73 (LWP 14600):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 72 (LWP 14599):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 71 (LWP 14598):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 70 (LWP 14597):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 69 (LWP 14596):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 68 (LWP 14595):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 67 (LWP 14594):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 66 (LWP 14593):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 65 (LWP 14592):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 64 (LWP 14591):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 63 (LWP 14590):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 62 (LWP 14589):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 61 (LWP 14588):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 60 (LWP 14587):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 59 (LWP 14586):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 58 (LWP 14585):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 57 (LWP 14584):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 56 (LWP 14583):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 55 (LWP 14582):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 54 (LWP 14581):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 53 (LWP 14580):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 52 (LWP 14579):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 51 (LWP 14578):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 50 (LWP 14577):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 49 (LWP 14576):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 48 (LWP 14575):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 47 (LWP 14574):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 46 (LWP 14573):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 45 (LWP 14572):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 44 (LWP 14571):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 43 (LWP 14570):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 42 (LWP 14569):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 41 (LWP 14568):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 40 (LWP 14567):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 39 (LWP 14566):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 38 (LWP 14565):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 37 (LWP 14564):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055c7d0534338 in ?? ()
#4  0x00007fe4a5d765d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fe4a5d765f0 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 36 (LWP 14563):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000001d14 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055c7d05342b8 in ?? ()
#4  0x00007fe4a65775d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fe4a65775f0 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 35 (LWP 14562):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000563 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055c7d05341bc in ?? ()
#4  0x00007fe4a6d785d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fe4a6d785f0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055c7d05341a8 in ?? ()
#9  0x00007fe4c4279770 in ?? ()
#10 0x00007fe4a6d785f0 in ?? ()
#11 0x00007fe4a6d78650 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 34 (LWP 14561):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000001727 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055c7d05340bc in ?? ()
#4  0x00007fe4a75795d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fe4a75795f0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055c7d05340a8 in ?? ()
#9  0x00007fe4c4279770 in ?? ()
#10 0x00007fe4a75795f0 in ?? ()
#11 0x00007fe4a7579650 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 33 (LWP 14560):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x000000000000174e in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055c7d0483fb8 in ?? ()
#4  0x00007fe4a7d7a5d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fe4a7d7a5f0 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 32 (LWP 14559):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 31 (LWP 14558):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 30 (LWP 14557):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 29 (LWP 14556):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 28 (LWP 14555):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 27 (LWP 14554):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 26 (LWP 14553):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 25 (LWP 14552):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 24 (LWP 14551):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 23 (LWP 14550):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 22 (LWP 14549):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 21 (LWP 14548):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 20 (LWP 14547):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 19 (LWP 14546):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 18 (LWP 14545):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 17 (LWP 14544):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 16 (LWP 14543):
#0  0x00007fe4c4279fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()
Thread 15 (LWP 14542):
#0  0x00007fe4c4279fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055c7cff9d6c8 in ?? ()
#5  0x00007fe4b0d8c6a0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 14 (LWP 14540):
#0  0x00007fe4c4279fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()
Thread 13 (LWP 14539):
#0  0x00007fe4c4279ad3 in ?? ()
#1  0x0000000000000000 in ?? ()
Thread 12 (LWP 14538):
#0  0x00007fe4c239ba47 in ?? ()
#1  0x00007fe4b2d90680 in ?? ()
#2  0x00007fe4bd69f571 in ?? ()
#3  0x00000000000002e3 in ?? ()
#4  0x000055c7d0094e58 in ?? ()
#5  0x00007fe4b2d906c0 in ?? ()
#6  0x00007fe4b2d90840 in ?? ()
#7  0x000055c7d0138d30 in ?? ()
#8  0x00007fe4bd6a125d in ?? ()
#9  0x3fb95d036ec90000 in ?? ()
#10 0x000055c7d0086c00 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055c7d0086c00 in ?? ()
#13 0x00000000d0094e58 in ?? ()
#14 0x000055c700000000 in ?? ()
#15 0x41da4020d459884d in ?? ()
#16 0x000055c7d0138d30 in ?? ()
#17 0x00007fe4b2d90720 in ?? ()
#18 0x00007fe4bd6a5ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb95d036ec90000 in ?? ()
#21 0x0000000000000000 in ?? ()
Thread 11 (LWP 14537):
#0  0x00007fe4c239ba47 in ?? ()
#1  0x00007fe4b3591680 in ?? ()
#2  0x00007fe4bd69f571 in ?? ()
#3  0x00000000000002e3 in ?? ()
#4  0x000055c7d0095a98 in ?? ()
#5  0x00007fe4b35916c0 in ?? ()
#6  0x00007fe4b3591840 in ?? ()
#7  0x000055c7d0138d30 in ?? ()
#8  0x00007fe4bd6a125d in ?? ()
#9  0x3fb95b7b95878000 in ?? ()
#10 0x000055c7d0085600 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055c7d0085600 in ?? ()
#13 0x00000000d0095a98 in ?? ()
#14 0x000055c700000000 in ?? ()
#15 0x41da4020d459884c in ?? ()
#16 0x000055c7d0138d30 in ?? ()
#17 0x00007fe4b3591720 in ?? ()
#18 0x00007fe4bd6a5ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb95b7b95878000 in ?? ()
#21 0x0000000000000000 in ?? ()
Thread 10 (LWP 14536):
#0  0x00007fe4c239ba47 in ?? ()
#1  0x00007fe4b3d92680 in ?? ()
#2  0x00007fe4bd69f571 in ?? ()
#3  0x00000000000002e3 in ?? ()
#4  0x000055c7d0095c58 in ?? ()
#5  0x00007fe4b3d926c0 in ?? ()
#6  0x00007fe4b3d92840 in ?? ()
#7  0x000055c7d0138d30 in ?? ()
#8  0x00007fe4bd6a125d in ?? ()
#9  0x3fa190f39c890000 in ?? ()
#10 0x000055c7d0086680 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055c7d0086680 in ?? ()
#13 0x00000000d0095c58 in ?? ()
#14 0x000055c700000000 in ?? ()
#15 0x41da4020d459884e in ?? ()
#16 0x000055c7d0138d30 in ?? ()
#17 0x00007fe4b3d92720 in ?? ()
#18 0x00007fe4bd6a5ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fa190f39c890000 in ?? ()
#21 0x0000000000000000 in ?? ()
Thread 9 (LWP 14535):
#0  0x00007fe4c239ba47 in ?? ()
#1  0x00007fe4b597d680 in ?? ()
#2  0x00007fe4bd69f571 in ?? ()
#3  0x00000000000002e3 in ?? ()
#4  0x000055c7d0095e18 in ?? ()
#5  0x00007fe4b597d6c0 in ?? ()
#6  0x00007fe4b597d840 in ?? ()
#7  0x000055c7d0138d30 in ?? ()
#8  0x00007fe4bd6a125d in ?? ()
#9  0x3fb98822c1cac000 in ?? ()
#10 0x000055c7d0085b80 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055c7d0085b80 in ?? ()
#13 0x00000000d0095e18 in ?? ()
#14 0x000055c700000000 in ?? ()
#15 0x41da4020d459884e in ?? ()
#16 0x000055c7d0138d30 in ?? ()
#17 0x00007fe4b597d720 in ?? ()
#18 0x00007fe4bd6a5ba3 in ?? ()
#19 0x0000000000000000 in ?? ()
Thread 8 (LWP 14532):
#0  0x00007fe4c238ecb9 in ?? ()
#1  0x00007fe4b7180840 in ?? ()
#2  0x0000000000000000 in ?? ()
Thread 7 (LWP 14531):
#0  0x00007fe4c4279fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()
Thread 6 (LWP 14530):
#0  0x00007fe4c427d9e2 in ?? ()
#1  0x000055c7cffb7ee0 in ?? ()
#2  0x00007fe4b617e4d0 in ?? ()
#3  0x00007fe4b617e450 in ?? ()
#4  0x00007fe4b617e570 in ?? ()
#5  0x00007fe4b617e790 in ?? ()
#6  0x00007fe4b617e7a0 in ?? ()
#7  0x00007fe4b617e4e0 in ?? ()
#8  0x00007fe4b617e4d0 in ?? ()
#9  0x000055c7cffb6350 in ?? ()
#10 0x00007fe4c4668c6f in ?? ()
#11 0x3ff0000000000000 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 5 (LWP 14524):
#0  0x00007fe4c4279fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000031 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055c7d013cdc8 in ?? ()
#5  0x00007fe4b8182430 in ?? ()
#6  0x0000000000000062 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 4 (LWP 14523):
#0  0x00007fe4c4279fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055c7cff9c848 in ?? ()
#5  0x00007fe4b8983790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 3 (LWP 14522):
#0  0x00007fe4c4279fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055c7cff9c2a8 in ?? ()
#5  0x00007fe4b9184790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 2 (LWP 14521):
#0  0x00007fe4c4279fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055c7cff9c188 in ?? ()
#5  0x00007fe4b9985790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()
Thread 1 (LWP 14519):
#0  0x00007fe4c427dd50 in ?? ()
#1  0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20251028 09:11:06.015051  8282 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6m1lU8/build/release/bin/kudu with pid 14383
I20251028 09:11:06.026156  8282 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6m1lU8/build/release/bin/kudu with pid 14250
I20251028 09:11:06.037858  8282 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6m1lU8/build/release/bin/kudu with pid 14651
I20251028 09:11:06.043370  8282 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6m1lU8/build/release/bin/kudu with pid 14519
I20251028 09:11:06.056197  8282 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6m1lU8/build/release/bin/kudu with pid 9447
2025-10-28T09:11:06Z chronyd exiting
I20251028 09:11:06.072932  8282 test_util.cc:183] -----------------------------------------------
I20251028 09:11:06.073011  8282 test_util.cc:184] Had failures, leaving test files at /tmp/dist-test-task6m1lU8/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761642599042423-8282-0
[  FAILED  ] RollingRestartArgs/RollingRestartITest.TestWorkloads/4, where GetParam() = 800-byte object <01-00 00-00 01-00 00-00 04-00 00-00 00-00 00-00 20-C0 60-61 33-56 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 40-C0 60-61 33-56 00-00 00-00 00-00 00-00 00-00 ... 00-00 00-00 00-00 00-00 F8-C2 60-61 33-56 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-01 00-00 00-00 00-00 04-00 00-00 03-00 00-00 01-00 00-00 00-00 00-00> (51342 ms)
[----------] 1 test from RollingRestartArgs/RollingRestartITest (51342 ms total)
[----------] Global test environment tear-down
[==========] 2 tests from 2 test suites ran. (67026 ms total)
[  PASSED  ] 1 test.
[  FAILED  ] 1 test, listed below:
[  FAILED  ] RollingRestartArgs/RollingRestartITest.TestWorkloads/4, where GetParam() = 800-byte object <01-00 00-00 01-00 00-00 04-00 00-00 00-00 00-00 20-C0 60-61 33-56 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 40-C0 60-61 33-56 00-00 00-00 00-00 00-00 00-00 ... 00-00 00-00 00-00 00-00 F8-C2 60-61 33-56 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-01 00-00 00-00 00-00 04-00 00-00 03-00 00-00 01-00 00-00 00-00 00-00>
 1 FAILED TEST
I20251028 09:11:06.073545  8282 logging.cc:424] LogThrottler /home/jenkins-slave/workspace/build_and_test_flaky/src/kudu/client/meta_cache.cc:302: suppressed but not reported on 25 messages since previous log ~10 seconds ago