Note: This is test shard 1 of 8.
[==========] Running 2 tests from 2 test suites.
[----------] Global test environment set-up.
[----------] 1 test from MaintenanceModeRF3ITest
[ RUN      ] MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate
2025-09-02T21:32:00Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2025-09-02T21:32:00Z Disabled control of system clock
WARNING: Logging before InitGoogleLogging() is written to STDERR
I20250902 21:32:00.934114 4307 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6wlXYv/build/release/bin/kudu
/tmp/dist-test-task6wlXYv/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.4.52.254:35631
--webserver_interface=127.4.52.254
--webserver_port=0
--builtin_ntp_servers=127.4.52.212:36103
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.4.52.254:35631 with env {}
W20250902 21:32:01.005137 4315 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250902 21:32:01.005277 4315 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250902 21:32:01.005295 4315 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250902 21:32:01.006429 4315 flags.cc:425] Enabled experimental flag: --ipki_ca_key_size=768
W20250902 21:32:01.006462 4315 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250902 21:32:01.006475 4315 flags.cc:425] Enabled experimental flag: --tsk_num_rsa_bits=512
W20250902 21:32:01.006486 4315 flags.cc:425] Enabled experimental flag: --rpc_reuseport=true
I20250902 21:32:01.007803 4315 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.4.52.212:36103
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.4.52.254:35631
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.4.52.254:35631
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.4.52.254
--webserver_port=0
--never_fsync=true
--heap_profile_path=/tmp/kudu.4315
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true
Master server version:
kudu 1.19.0-SNAPSHOT
revision 6ab3f249351a4ee5d73eb8bda3585c4dcd85d070
build type RELEASE
built by None at 02 Sep 2025 21:15:19 UTC on 5fd53c4cbb9d
build id 7867
I20250902 21:32:01.007970 4315 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250902 21:32:01.008131 4315 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250902 21:32:01.010530 4321 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:01.010636 4315 server_base.cc:1047] running on GCE node
W20250902 21:32:01.010574 4320 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:01.010643 4323 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:01.010989 4315 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250902 21:32:01.011201 4315 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250902 21:32:01.012332 4315 hybrid_clock.cc:648] HybridClock initialized: now 1756848721012315 us; error 35 us; skew 500 ppm
I20250902 21:32:01.013267 4315 webserver.cc:480] Webserver started at http://127.4.52.254:44043/ using document root <none> and password file <none>
I20250902 21:32:01.013433 4315 fs_manager.cc:362] Metadata directory not provided
I20250902 21:32:01.013469 4315 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250902 21:32:01.013584 4315 server_base.cc:895] This appears to be a new deployment of Kudu; creating new FS layout
I20250902 21:32:01.014326 4315 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/master-0/data/instance:
uuid: "3323a6faa7434a218c3dae2aa20a5e9b"
format_stamp: "Formatted at 2025-09-02 21:32:01 on dist-test-slave-jkp9"
I20250902 21:32:01.014596 4315 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/master-0/wal/instance:
uuid: "3323a6faa7434a218c3dae2aa20a5e9b"
format_stamp: "Formatted at 2025-09-02 21:32:01 on dist-test-slave-jkp9"
I20250902 21:32:01.015666 4315 fs_manager.cc:696] Time spent creating directory manager: real 0.001s user 0.000s sys 0.001s
I20250902 21:32:01.016206 4329 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:01.016338 4315 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.000s sys 0.001s
I20250902 21:32:01.016403 4315 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/master-0/data,/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/master-0/wal
uuid: "3323a6faa7434a218c3dae2aa20a5e9b"
format_stamp: "Formatted at 2025-09-02 21:32:01 on dist-test-slave-jkp9"
I20250902 21:32:01.016456 4315 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250902 21:32:01.028038 4315 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250902 21:32:01.028304 4315 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250902 21:32:01.028414 4315 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250902 21:32:01.032030 4315 rpc_server.cc:307] RPC server started. Bound to: 127.4.52.254:35631
I20250902 21:32:01.032086 4381 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.4.52.254:35631 every 8 connection(s)
I20250902 21:32:01.032344 4315 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/master-0/data/info.pb
I20250902 21:32:01.032881 4382 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20250902 21:32:01.035027 4382 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b: Bootstrap starting.
I20250902 21:32:01.035594 4382 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b: Neither blocks nor log segments found. Creating new log.
I20250902 21:32:01.035843 4382 log.cc:826] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b: Log is configured to *not* fsync() on all Append() calls
I20250902 21:32:01.036357 4382 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b: No bootstrap required, opened a new log
I20250902 21:32:01.037549 4307 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6wlXYv/build/release/bin/kudu as pid 4315
I20250902 21:32:01.037642 4307 external_mini_cluster.cc:1442] Reading /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/master-0/wal/instance
I20250902 21:32:01.037533 4382 raft_consensus.cc:357] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "3323a6faa7434a218c3dae2aa20a5e9b" member_type: VOTER last_known_addr { host: "127.4.52.254" port: 35631 } }
I20250902 21:32:01.037673 4382 raft_consensus.cc:383] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250902 21:32:01.037714 4382 raft_consensus.cc:738] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 3323a6faa7434a218c3dae2aa20a5e9b, State: Initialized, Role: FOLLOWER
I20250902 21:32:01.037842 4382 consensus_queue.cc:260] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "3323a6faa7434a218c3dae2aa20a5e9b" member_type: VOTER last_known_addr { host: "127.4.52.254" port: 35631 } }
I20250902 21:32:01.037902 4382 raft_consensus.cc:397] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250902 21:32:01.037928 4382 raft_consensus.cc:491] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250902 21:32:01.037966 4382 raft_consensus.cc:3058] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b [term 0 FOLLOWER]: Advancing to term 1
I20250902 21:32:01.038610 4382 raft_consensus.cc:513] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "3323a6faa7434a218c3dae2aa20a5e9b" member_type: VOTER last_known_addr { host: "127.4.52.254" port: 35631 } }
I20250902 21:32:01.038712 4382 leader_election.cc:304] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 3323a6faa7434a218c3dae2aa20a5e9b; no voters:
I20250902 21:32:01.038877 4382 leader_election.cc:290] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b [CANDIDATE]: Term 1 election: Requested vote from peers
I20250902 21:32:01.038930 4385 raft_consensus.cc:2802] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b [term 1 FOLLOWER]: Leader election won for term 1
I20250902 21:32:01.039146 4385 raft_consensus.cc:695] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b [term 1 LEADER]: Becoming Leader. State: Replica: 3323a6faa7434a218c3dae2aa20a5e9b, State: Running, Role: LEADER
I20250902 21:32:01.039194 4382 sys_catalog.cc:564] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b [sys.catalog]: configured and running, proceeding with master startup.
I20250902 21:32:01.039297 4385 consensus_queue.cc:237] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "3323a6faa7434a218c3dae2aa20a5e9b" member_type: VOTER last_known_addr { host: "127.4.52.254" port: 35631 } }
I20250902 21:32:01.039647 4388 sys_catalog.cc:455] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "3323a6faa7434a218c3dae2aa20a5e9b" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "3323a6faa7434a218c3dae2aa20a5e9b" member_type: VOTER last_known_addr { host: "127.4.52.254" port: 35631 } } }
I20250902 21:32:01.039835 4388 sys_catalog.cc:458] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b [sys.catalog]: This master's current role is: LEADER
I20250902 21:32:01.039775 4389 sys_catalog.cc:455] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b [sys.catalog]: SysCatalogTable state changed. Reason: New leader 3323a6faa7434a218c3dae2aa20a5e9b. Latest consensus state: current_term: 1 leader_uuid: "3323a6faa7434a218c3dae2aa20a5e9b" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "3323a6faa7434a218c3dae2aa20a5e9b" member_type: VOTER last_known_addr { host: "127.4.52.254" port: 35631 } } }
I20250902 21:32:01.040031 4389 sys_catalog.cc:458] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b [sys.catalog]: This master's current role is: LEADER
I20250902 21:32:01.040208 4396 catalog_manager.cc:1477] Loading table and tablet metadata into memory...
I20250902 21:32:01.040619 4396 catalog_manager.cc:1486] Initializing Kudu cluster ID...
I20250902 21:32:01.042078 4396 catalog_manager.cc:1349] Generated new cluster ID: 8d38126b93b947b29dc440ea611353c0
I20250902 21:32:01.042137 4396 catalog_manager.cc:1497] Initializing Kudu internal certificate authority...
I20250902 21:32:01.050580 4396 catalog_manager.cc:1372] Generated new certificate authority record
I20250902 21:32:01.051081 4396 catalog_manager.cc:1506] Loading token signing keys...
I20250902 21:32:01.056917 4396 catalog_manager.cc:5955] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b: Generated new TSK 0
I20250902 21:32:01.057075 4396 catalog_manager.cc:1516] Initializing in-progress tserver states...
I20250902 21:32:01.063400 4307 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6wlXYv/build/release/bin/kudu
/tmp/dist-test-task6wlXYv/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.4.52.193:0
--local_ip_for_outbound_sockets=127.4.52.193
--webserver_interface=127.4.52.193
--webserver_port=0
--tserver_master_addrs=127.4.52.254:35631
--builtin_ntp_servers=127.4.52.212:36103
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--follower_unavailable_considered_failed_sec=2
--enable_log_gc=false with env {}
W20250902 21:32:01.136269 4406 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250902 21:32:01.136418 4406 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250902 21:32:01.136447 4406 flags.cc:425] Enabled unsafe flag: --enable_log_gc=false
W20250902 21:32:01.136466 4406 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250902 21:32:01.137720 4406 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250902 21:32:01.137774 4406 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.4.52.193
I20250902 21:32:01.139155 4406 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.4.52.212:36103
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--follower_unavailable_considered_failed_sec=2
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.4.52.193:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.4.52.193
--webserver_port=0
--enable_log_gc=false
--tserver_master_addrs=127.4.52.254:35631
--never_fsync=true
--heap_profile_path=/tmp/kudu.4406
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.4.52.193
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 6ab3f249351a4ee5d73eb8bda3585c4dcd85d070
build type RELEASE
built by None at 02 Sep 2025 21:15:19 UTC on 5fd53c4cbb9d
build id 7867
I20250902 21:32:01.139396 4406 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250902 21:32:01.139631 4406 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250902 21:32:01.142000 4412 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:01.142045 4411 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:01.142197 4414 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:01.142496 4406 server_base.cc:1047] running on GCE node
I20250902 21:32:01.142656 4406 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250902 21:32:01.142901 4406 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250902 21:32:01.144060 4406 hybrid_clock.cc:648] HybridClock initialized: now 1756848721144042 us; error 32 us; skew 500 ppm
I20250902 21:32:01.145227 4406 webserver.cc:480] Webserver started at http://127.4.52.193:40731/ using document root <none> and password file <none>
I20250902 21:32:01.145450 4406 fs_manager.cc:362] Metadata directory not provided
I20250902 21:32:01.145499 4406 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250902 21:32:01.145610 4406 server_base.cc:895] This appears to be a new deployment of Kudu; creating new FS layout
I20250902 21:32:01.146677 4406 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-0/data/instance:
uuid: "4dd1d49578df44b0a0325def79d07969"
format_stamp: "Formatted at 2025-09-02 21:32:01 on dist-test-slave-jkp9"
I20250902 21:32:01.147089 4406 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-0/wal/instance:
uuid: "4dd1d49578df44b0a0325def79d07969"
format_stamp: "Formatted at 2025-09-02 21:32:01 on dist-test-slave-jkp9"
I20250902 21:32:01.148461 4406 fs_manager.cc:696] Time spent creating directory manager: real 0.001s user 0.001s sys 0.000s
I20250902 21:32:01.149183 4420 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:01.149343 4406 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.001s sys 0.000s
I20250902 21:32:01.149420 4406 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-0/data,/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-0/wal
uuid: "4dd1d49578df44b0a0325def79d07969"
format_stamp: "Formatted at 2025-09-02 21:32:01 on dist-test-slave-jkp9"
I20250902 21:32:01.149477 4406 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250902 21:32:01.164357 4406 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250902 21:32:01.164616 4406 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250902 21:32:01.164723 4406 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250902 21:32:01.164927 4406 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250902 21:32:01.165246 4406 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250902 21:32:01.165280 4406 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:01.165302 4406 ts_tablet_manager.cc:610] Registered 0 tablets
I20250902 21:32:01.165331 4406 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:01.172062 4406 rpc_server.cc:307] RPC server started. Bound to: 127.4.52.193:41549
I20250902 21:32:01.172128 4533 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.4.52.193:41549 every 8 connection(s)
I20250902 21:32:01.172375 4406 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-0/data/info.pb
I20250902 21:32:01.176595 4534 heartbeater.cc:344] Connected to a master server at 127.4.52.254:35631
I20250902 21:32:01.176695 4534 heartbeater.cc:461] Registering TS with master...
I20250902 21:32:01.176889 4534 heartbeater.cc:507] Master 127.4.52.254:35631 requested a full tablet report, sending...
I20250902 21:32:01.177260 4307 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6wlXYv/build/release/bin/kudu as pid 4406
I20250902 21:32:01.177336 4307 external_mini_cluster.cc:1442] Reading /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-0/wal/instance
I20250902 21:32:01.177342 4346 ts_manager.cc:194] Registered new tserver with Master: 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549)
I20250902 21:32:01.178081 4346 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.4.52.193:56143
I20250902 21:32:01.178661 4307 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6wlXYv/build/release/bin/kudu
/tmp/dist-test-task6wlXYv/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.4.52.194:0
--local_ip_for_outbound_sockets=127.4.52.194
--webserver_interface=127.4.52.194
--webserver_port=0
--tserver_master_addrs=127.4.52.254:35631
--builtin_ntp_servers=127.4.52.212:36103
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--follower_unavailable_considered_failed_sec=2
--enable_log_gc=false with env {}
W20250902 21:32:01.256523 4537 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250902 21:32:01.256670 4537 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250902 21:32:01.256686 4537 flags.cc:425] Enabled unsafe flag: --enable_log_gc=false
W20250902 21:32:01.256698 4537 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250902 21:32:01.257895 4537 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250902 21:32:01.257943 4537 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.4.52.194
I20250902 21:32:01.259305 4537 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.4.52.212:36103
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--follower_unavailable_considered_failed_sec=2
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.4.52.194:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.4.52.194
--webserver_port=0
--enable_log_gc=false
--tserver_master_addrs=127.4.52.254:35631
--never_fsync=true
--heap_profile_path=/tmp/kudu.4537
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.4.52.194
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 6ab3f249351a4ee5d73eb8bda3585c4dcd85d070
build type RELEASE
built by None at 02 Sep 2025 21:15:19 UTC on 5fd53c4cbb9d
build id 7867
I20250902 21:32:01.259497 4537 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250902 21:32:01.259735 4537 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250902 21:32:01.262017 4543 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:01.262028 4542 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:01.262161 4545 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:01.262176 4537 server_base.cc:1047] running on GCE node
I20250902 21:32:01.262364 4537 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250902 21:32:01.262563 4537 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250902 21:32:01.263688 4537 hybrid_clock.cc:648] HybridClock initialized: now 1756848721263673 us; error 33 us; skew 500 ppm
I20250902 21:32:01.264688 4537 webserver.cc:480] Webserver started at http://127.4.52.194:42975/ using document root <none> and password file <none>
I20250902 21:32:01.264884 4537 fs_manager.cc:362] Metadata directory not provided
I20250902 21:32:01.264927 4537 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250902 21:32:01.265033 4537 server_base.cc:895] This appears to be a new deployment of Kudu; creating new FS layout
I20250902 21:32:01.265813 4537 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-1/data/instance:
uuid: "9a6e0058b145476b9c65606690bad44c"
format_stamp: "Formatted at 2025-09-02 21:32:01 on dist-test-slave-jkp9"
I20250902 21:32:01.266101 4537 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-1/wal/instance:
uuid: "9a6e0058b145476b9c65606690bad44c"
format_stamp: "Formatted at 2025-09-02 21:32:01 on dist-test-slave-jkp9"
I20250902 21:32:01.267138 4537 fs_manager.cc:696] Time spent creating directory manager: real 0.001s user 0.000s sys 0.002s
I20250902 21:32:01.267673 4551 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:01.267819 4537 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.000s sys 0.001s
I20250902 21:32:01.267880 4537 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-1/data,/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-1/wal
uuid: "9a6e0058b145476b9c65606690bad44c"
format_stamp: "Formatted at 2025-09-02 21:32:01 on dist-test-slave-jkp9"
I20250902 21:32:01.267932 4537 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250902 21:32:01.298122 4537 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250902 21:32:01.298429 4537 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250902 21:32:01.298554 4537 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250902 21:32:01.298748 4537 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250902 21:32:01.299049 4537 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250902 21:32:01.299078 4537 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:01.299110 4537 ts_tablet_manager.cc:610] Registered 0 tablets
I20250902 21:32:01.299129 4537 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:01.304318 4537 rpc_server.cc:307] RPC server started. Bound to: 127.4.52.194:33533
I20250902 21:32:01.304396 4664 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.4.52.194:33533 every 8 connection(s)
I20250902 21:32:01.304638 4537 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-1/data/info.pb
I20250902 21:32:01.308583 4665 heartbeater.cc:344] Connected to a master server at 127.4.52.254:35631
I20250902 21:32:01.308660 4665 heartbeater.cc:461] Registering TS with master...
I20250902 21:32:01.308799 4665 heartbeater.cc:507] Master 127.4.52.254:35631 requested a full tablet report, sending...
I20250902 21:32:01.309079 4345 ts_manager.cc:194] Registered new tserver with Master: 9a6e0058b145476b9c65606690bad44c (127.4.52.194:33533)
I20250902 21:32:01.309393 4345 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.4.52.194:36281
I20250902 21:32:01.313355 4307 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6wlXYv/build/release/bin/kudu as pid 4537
I20250902 21:32:01.313414 4307 external_mini_cluster.cc:1442] Reading /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-1/wal/instance
I20250902 21:32:01.314244 4307 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6wlXYv/build/release/bin/kudu
/tmp/dist-test-task6wlXYv/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.4.52.195:0
--local_ip_for_outbound_sockets=127.4.52.195
--webserver_interface=127.4.52.195
--webserver_port=0
--tserver_master_addrs=127.4.52.254:35631
--builtin_ntp_servers=127.4.52.212:36103
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--follower_unavailable_considered_failed_sec=2
--enable_log_gc=false with env {}
W20250902 21:32:01.385543 4668 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250902 21:32:01.385694 4668 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250902 21:32:01.385708 4668 flags.cc:425] Enabled unsafe flag: --enable_log_gc=false
W20250902 21:32:01.385739 4668 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250902 21:32:01.387029 4668 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250902 21:32:01.387072 4668 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.4.52.195
I20250902 21:32:01.388437 4668 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.4.52.212:36103
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--follower_unavailable_considered_failed_sec=2
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.4.52.195:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.4.52.195
--webserver_port=0
--enable_log_gc=false
--tserver_master_addrs=127.4.52.254:35631
--never_fsync=true
--heap_profile_path=/tmp/kudu.4668
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.4.52.195
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 6ab3f249351a4ee5d73eb8bda3585c4dcd85d070
build type RELEASE
built by None at 02 Sep 2025 21:15:19 UTC on 5fd53c4cbb9d
build id 7867
I20250902 21:32:01.388622 4668 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250902 21:32:01.388847 4668 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250902 21:32:01.391063 4673 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:01.391052 4674 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:01.391119 4676 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:01.391196 4668 server_base.cc:1047] running on GCE node
I20250902 21:32:01.391404 4668 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250902 21:32:01.391556 4668 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250902 21:32:01.392679 4668 hybrid_clock.cc:648] HybridClock initialized: now 1756848721392662 us; error 30 us; skew 500 ppm
I20250902 21:32:01.393658 4668 webserver.cc:480] Webserver started at http://127.4.52.195:46813/ using document root <none> and password file <none>
I20250902 21:32:01.393826 4668 fs_manager.cc:362] Metadata directory not provided
I20250902 21:32:01.393867 4668 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250902 21:32:01.393965 4668 server_base.cc:895] This appears to be a new deployment of Kudu; creating new FS layout
I20250902 21:32:01.394719 4668 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-2/data/instance:
uuid: "06e82aae52a24d6db2e51581ee7ca9ed"
format_stamp: "Formatted at 2025-09-02 21:32:01 on dist-test-slave-jkp9"
I20250902 21:32:01.395035 4668 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-2/wal/instance:
uuid: "06e82aae52a24d6db2e51581ee7ca9ed"
format_stamp: "Formatted at 2025-09-02 21:32:01 on dist-test-slave-jkp9"
I20250902 21:32:01.396111 4668 fs_manager.cc:696] Time spent creating directory manager: real 0.001s user 0.001s sys 0.000s
I20250902 21:32:01.396664 4682 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:01.396809 4668 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.001s sys 0.000s
I20250902 21:32:01.396874 4668 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-2/data,/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-2/wal
uuid: "06e82aae52a24d6db2e51581ee7ca9ed"
format_stamp: "Formatted at 2025-09-02 21:32:01 on dist-test-slave-jkp9"
I20250902 21:32:01.396929 4668 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250902 21:32:01.408052 4668 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250902 21:32:01.408308 4668 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250902 21:32:01.408419 4668 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250902 21:32:01.408622 4668 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250902 21:32:01.408885 4668 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250902 21:32:01.408915 4668 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:01.408946 4668 ts_tablet_manager.cc:610] Registered 0 tablets
I20250902 21:32:01.408975 4668 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:01.416699 4668 rpc_server.cc:307] RPC server started. Bound to: 127.4.52.195:36693
I20250902 21:32:01.416783 4795 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.4.52.195:36693 every 8 connection(s)
I20250902 21:32:01.417035 4668 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-2/data/info.pb
I20250902 21:32:01.417991 4307 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6wlXYv/build/release/bin/kudu as pid 4668
I20250902 21:32:01.418078 4307 external_mini_cluster.cc:1442] Reading /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-2/wal/instance
I20250902 21:32:01.421495 4796 heartbeater.cc:344] Connected to a master server at 127.4.52.254:35631
I20250902 21:32:01.421599 4796 heartbeater.cc:461] Registering TS with master...
I20250902 21:32:01.421751 4796 heartbeater.cc:507] Master 127.4.52.254:35631 requested a full tablet report, sending...
I20250902 21:32:01.421981 4345 ts_manager.cc:194] Registered new tserver with Master: 06e82aae52a24d6db2e51581ee7ca9ed (127.4.52.195:36693)
I20250902 21:32:01.422343 4345 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.4.52.195:55327
I20250902 21:32:01.429831 4307 external_mini_cluster.cc:949] 3 TS(s) registered with all masters
I20250902 21:32:01.435072 4307 test_util.cc:276] Using random seed: 938940861
I20250902 21:32:01.440793 4345 catalog_manager.cc:2232] Servicing CreateTable request from {username='slave'} at 127.0.0.1:32988:
name: "test-workload"
schema {
columns {
name: "key"
type: INT32
is_key: true
is_nullable: false
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
columns {
name: "int_val"
type: INT32
is_key: false
is_nullable: false
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
columns {
name: "string_val"
type: STRING
is_key: false
is_nullable: true
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
}
num_replicas: 3
split_rows_range_bounds {
rows: "<redacted>""\004\001\000UUU\025\004\001\000\252\252\252*\004\001\000\377\377\377?\004\001\000TUUU\004\001\000\251\252\252j"
indirect_data: "<redacted>"""
}
partition_schema {
range_schema {
columns {
name: "key"
}
}
}
W20250902 21:32:01.441063 4345 catalog_manager.cc:6944] The number of live tablet servers is not enough to re-replicate a tablet replica of the newly created table test-workload in case of a server failure: 4 tablet servers would be needed, 3 are available. Consider bringing up more tablet servers.
I20250902 21:32:01.447027 4596 tablet_service.cc:1468] Processing CreateTablet for tablet 2f100030f5ec42f085bf83f379ebb850 (DEFAULT_TABLE table=test-workload [id=fe8c3f057d5241579c8454bc9b3cfdd8]), partition=RANGE (key) PARTITION 1073741823 <= VALUES < 1431655764
I20250902 21:32:01.446976 4595 tablet_service.cc:1468] Processing CreateTablet for tablet 8c955e3402124b248b2740afe7cdff4d (DEFAULT_TABLE table=test-workload [id=fe8c3f057d5241579c8454bc9b3cfdd8]), partition=RANGE (key) PARTITION 1431655764 <= VALUES < 1789569705
I20250902 21:32:01.447144 4594 tablet_service.cc:1468] Processing CreateTablet for tablet 32d5814bb7e34ba8a525018a3d441fc7 (DEFAULT_TABLE table=test-workload [id=fe8c3f057d5241579c8454bc9b3cfdd8]), partition=RANGE (key) PARTITION 1789569705 <= VALUES
I20250902 21:32:01.446976 4599 tablet_service.cc:1468] Processing CreateTablet for tablet 71a6015393b644f5abbd15f20c69a5ec (DEFAULT_TABLE table=test-workload [id=fe8c3f057d5241579c8454bc9b3cfdd8]), partition=RANGE (key) PARTITION VALUES < 357913941
I20250902 21:32:01.447388 4598 tablet_service.cc:1468] Processing CreateTablet for tablet b3f81968196f41b3a5e3bead654d937a (DEFAULT_TABLE table=test-workload [id=fe8c3f057d5241579c8454bc9b3cfdd8]), partition=RANGE (key) PARTITION 357913941 <= VALUES < 715827882
I20250902 21:32:01.447430 4597 tablet_service.cc:1468] Processing CreateTablet for tablet 85463c5915304980b0a2aba153ed3da0 (DEFAULT_TABLE table=test-workload [id=fe8c3f057d5241579c8454bc9b3cfdd8]), partition=RANGE (key) PARTITION 715827882 <= VALUES < 1073741823
I20250902 21:32:01.447351 4596 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 2f100030f5ec42f085bf83f379ebb850. 1 dirs total, 0 dirs full, 0 dirs failed
I20250902 21:32:01.447649 4597 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 85463c5915304980b0a2aba153ed3da0. 1 dirs total, 0 dirs full, 0 dirs failed
I20250902 21:32:01.448063 4468 tablet_service.cc:1468] Processing CreateTablet for tablet 71a6015393b644f5abbd15f20c69a5ec (DEFAULT_TABLE table=test-workload [id=fe8c3f057d5241579c8454bc9b3cfdd8]), partition=RANGE (key) PARTITION VALUES < 357913941
I20250902 21:32:01.448238 4599 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 71a6015393b644f5abbd15f20c69a5ec. 1 dirs total, 0 dirs full, 0 dirs failed
I20250902 21:32:01.448354 4468 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 71a6015393b644f5abbd15f20c69a5ec. 1 dirs total, 0 dirs full, 0 dirs failed
I20250902 21:32:01.448949 4727 tablet_service.cc:1468] Processing CreateTablet for tablet 2f100030f5ec42f085bf83f379ebb850 (DEFAULT_TABLE table=test-workload [id=fe8c3f057d5241579c8454bc9b3cfdd8]), partition=RANGE (key) PARTITION 1073741823 <= VALUES < 1431655764
I20250902 21:32:01.449043 4729 tablet_service.cc:1468] Processing CreateTablet for tablet b3f81968196f41b3a5e3bead654d937a (DEFAULT_TABLE table=test-workload [id=fe8c3f057d5241579c8454bc9b3cfdd8]), partition=RANGE (key) PARTITION 357913941 <= VALUES < 715827882
I20250902 21:32:01.448915 4728 tablet_service.cc:1468] Processing CreateTablet for tablet 85463c5915304980b0a2aba153ed3da0 (DEFAULT_TABLE table=test-workload [id=fe8c3f057d5241579c8454bc9b3cfdd8]), partition=RANGE (key) PARTITION 715827882 <= VALUES < 1073741823
I20250902 21:32:01.449185 4727 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 2f100030f5ec42f085bf83f379ebb850. 1 dirs total, 0 dirs full, 0 dirs failed
I20250902 21:32:01.449266 4728 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 85463c5915304980b0a2aba153ed3da0. 1 dirs total, 0 dirs full, 0 dirs failed
I20250902 21:32:01.450819 4467 tablet_service.cc:1468] Processing CreateTablet for tablet b3f81968196f41b3a5e3bead654d937a (DEFAULT_TABLE table=test-workload [id=fe8c3f057d5241579c8454bc9b3cfdd8]), partition=RANGE (key) PARTITION 357913941 <= VALUES < 715827882
I20250902 21:32:01.450910 4467 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet b3f81968196f41b3a5e3bead654d937a. 1 dirs total, 0 dirs full, 0 dirs failed
I20250902 21:32:01.451584 4817 tablet_bootstrap.cc:492] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed: Bootstrap starting.
I20250902 21:32:01.451654 4725 tablet_service.cc:1468] Processing CreateTablet for tablet 32d5814bb7e34ba8a525018a3d441fc7 (DEFAULT_TABLE table=test-workload [id=fe8c3f057d5241579c8454bc9b3cfdd8]), partition=RANGE (key) PARTITION 1789569705 <= VALUES
I20250902 21:32:01.451659 4816 tablet_bootstrap.cc:492] T 71a6015393b644f5abbd15f20c69a5ec P 4dd1d49578df44b0a0325def79d07969: Bootstrap starting.
I20250902 21:32:01.451727 4725 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 32d5814bb7e34ba8a525018a3d441fc7. 1 dirs total, 0 dirs full, 0 dirs failed
I20250902 21:32:01.452095 4466 tablet_service.cc:1468] Processing CreateTablet for tablet 85463c5915304980b0a2aba153ed3da0 (DEFAULT_TABLE table=test-workload [id=fe8c3f057d5241579c8454bc9b3cfdd8]), partition=RANGE (key) PARTITION 715827882 <= VALUES < 1073741823
I20250902 21:32:01.452174 4466 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 85463c5915304980b0a2aba153ed3da0. 1 dirs total, 0 dirs full, 0 dirs failed
I20250902 21:32:01.452193 4817 tablet_bootstrap.cc:654] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed: Neither blocks nor log segments found. Creating new log.
I20250902 21:32:01.452212 4816 tablet_bootstrap.cc:654] T 71a6015393b644f5abbd15f20c69a5ec P 4dd1d49578df44b0a0325def79d07969: Neither blocks nor log segments found. Creating new log.
I20250902 21:32:01.452454 4817 log.cc:826] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed: Log is configured to *not* fsync() on all Append() calls
I20250902 21:32:01.452464 4816 log.cc:826] T 71a6015393b644f5abbd15f20c69a5ec P 4dd1d49578df44b0a0325def79d07969: Log is configured to *not* fsync() on all Append() calls
I20250902 21:32:01.452747 4815 tablet_bootstrap.cc:492] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c: Bootstrap starting.
I20250902 21:32:01.451597 4726 tablet_service.cc:1468] Processing CreateTablet for tablet 8c955e3402124b248b2740afe7cdff4d (DEFAULT_TABLE table=test-workload [id=fe8c3f057d5241579c8454bc9b3cfdd8]), partition=RANGE (key) PARTITION 1431655764 <= VALUES < 1789569705
I20250902 21:32:01.452926 4726 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 8c955e3402124b248b2740afe7cdff4d. 1 dirs total, 0 dirs full, 0 dirs failed
I20250902 21:32:01.453317 4815 tablet_bootstrap.cc:654] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c: Neither blocks nor log segments found. Creating new log.
I20250902 21:32:01.453464 4465 tablet_service.cc:1468] Processing CreateTablet for tablet 2f100030f5ec42f085bf83f379ebb850 (DEFAULT_TABLE table=test-workload [id=fe8c3f057d5241579c8454bc9b3cfdd8]), partition=RANGE (key) PARTITION 1073741823 <= VALUES < 1431655764
I20250902 21:32:01.453538 4465 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 2f100030f5ec42f085bf83f379ebb850. 1 dirs total, 0 dirs full, 0 dirs failed
I20250902 21:32:01.453584 4815 log.cc:826] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c: Log is configured to *not* fsync() on all Append() calls
I20250902 21:32:01.454018 4729 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet b3f81968196f41b3a5e3bead654d937a. 1 dirs total, 0 dirs full, 0 dirs failed
I20250902 21:32:01.454258 4815 tablet_bootstrap.cc:492] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c: No bootstrap required, opened a new log
I20250902 21:32:01.454305 4815 ts_tablet_manager.cc:1397] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c: Time spent bootstrapping tablet: real 0.002s user 0.001s sys 0.000s
I20250902 21:32:01.454643 4464 tablet_service.cc:1468] Processing CreateTablet for tablet 8c955e3402124b248b2740afe7cdff4d (DEFAULT_TABLE table=test-workload [id=fe8c3f057d5241579c8454bc9b3cfdd8]), partition=RANGE (key) PARTITION 1431655764 <= VALUES < 1789569705
I20250902 21:32:01.454715 4464 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 8c955e3402124b248b2740afe7cdff4d. 1 dirs total, 0 dirs full, 0 dirs failed
I20250902 21:32:01.448913 4730 tablet_service.cc:1468] Processing CreateTablet for tablet 71a6015393b644f5abbd15f20c69a5ec (DEFAULT_TABLE table=test-workload [id=fe8c3f057d5241579c8454bc9b3cfdd8]), partition=RANGE (key) PARTITION VALUES < 357913941
I20250902 21:32:01.455066 4463 tablet_service.cc:1468] Processing CreateTablet for tablet 32d5814bb7e34ba8a525018a3d441fc7 (DEFAULT_TABLE table=test-workload [id=fe8c3f057d5241579c8454bc9b3cfdd8]), partition=RANGE (key) PARTITION 1789569705 <= VALUES
I20250902 21:32:01.455119 4730 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 71a6015393b644f5abbd15f20c69a5ec. 1 dirs total, 0 dirs full, 0 dirs failed
I20250902 21:32:01.455152 4463 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 32d5814bb7e34ba8a525018a3d441fc7. 1 dirs total, 0 dirs full, 0 dirs failed
I20250902 21:32:01.456039 4815 raft_consensus.cc:357] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.456128 4815 raft_consensus.cc:383] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250902 21:32:01.456146 4815 raft_consensus.cc:738] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 9a6e0058b145476b9c65606690bad44c, State: Initialized, Role: FOLLOWER
I20250902 21:32:01.456216 4815 consensus_queue.cc:260] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.456420 4815 ts_tablet_manager.cc:1428] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c: Time spent starting tablet: real 0.002s user 0.002s sys 0.000s
I20250902 21:32:01.456444 4665 heartbeater.cc:499] Master 127.4.52.254:35631 was elected leader, sending a full tablet report...
I20250902 21:32:01.456642 4595 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 8c955e3402124b248b2740afe7cdff4d. 1 dirs total, 0 dirs full, 0 dirs failed
I20250902 21:32:01.457266 4815 tablet_bootstrap.cc:492] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c: Bootstrap starting.
I20250902 21:32:01.457679 4815 tablet_bootstrap.cc:654] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c: Neither blocks nor log segments found. Creating new log.
I20250902 21:32:01.457746 4594 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 32d5814bb7e34ba8a525018a3d441fc7. 1 dirs total, 0 dirs full, 0 dirs failed
I20250902 21:32:01.458210 4815 tablet_bootstrap.cc:492] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c: No bootstrap required, opened a new log
I20250902 21:32:01.458266 4815 ts_tablet_manager.cc:1397] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c: Time spent bootstrapping tablet: real 0.001s user 0.001s sys 0.000s
I20250902 21:32:01.458398 4815 raft_consensus.cc:357] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } }
I20250902 21:32:01.458460 4815 raft_consensus.cc:383] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250902 21:32:01.458484 4815 raft_consensus.cc:738] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 9a6e0058b145476b9c65606690bad44c, State: Initialized, Role: FOLLOWER
I20250902 21:32:01.458523 4815 consensus_queue.cc:260] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } }
I20250902 21:32:01.458606 4815 ts_tablet_manager.cc:1428] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c: Time spent starting tablet: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:01.458648 4815 tablet_bootstrap.cc:492] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c: Bootstrap starting.
I20250902 21:32:01.458967 4598 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet b3f81968196f41b3a5e3bead654d937a. 1 dirs total, 0 dirs full, 0 dirs failed
I20250902 21:32:01.459022 4815 tablet_bootstrap.cc:654] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c: Neither blocks nor log segments found. Creating new log.
I20250902 21:32:01.459518 4815 tablet_bootstrap.cc:492] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c: No bootstrap required, opened a new log
I20250902 21:32:01.459575 4815 ts_tablet_manager.cc:1397] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c: Time spent bootstrapping tablet: real 0.001s user 0.000s sys 0.001s
I20250902 21:32:01.459790 4815 raft_consensus.cc:357] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.459837 4815 raft_consensus.cc:383] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250902 21:32:01.459862 4815 raft_consensus.cc:738] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 9a6e0058b145476b9c65606690bad44c, State: Initialized, Role: FOLLOWER
I20250902 21:32:01.459928 4815 consensus_queue.cc:260] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.460006 4815 ts_tablet_manager.cc:1428] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c: Time spent starting tablet: real 0.000s user 0.000s sys 0.001s
I20250902 21:32:01.460070 4815 tablet_bootstrap.cc:492] T 2f100030f5ec42f085bf83f379ebb850 P 9a6e0058b145476b9c65606690bad44c: Bootstrap starting.
I20250902 21:32:01.460351 4816 tablet_bootstrap.cc:492] T 71a6015393b644f5abbd15f20c69a5ec P 4dd1d49578df44b0a0325def79d07969: No bootstrap required, opened a new log
I20250902 21:32:01.460407 4816 ts_tablet_manager.cc:1397] T 71a6015393b644f5abbd15f20c69a5ec P 4dd1d49578df44b0a0325def79d07969: Time spent bootstrapping tablet: real 0.009s user 0.001s sys 0.000s
I20250902 21:32:01.460443 4815 tablet_bootstrap.cc:654] T 2f100030f5ec42f085bf83f379ebb850 P 9a6e0058b145476b9c65606690bad44c: Neither blocks nor log segments found. Creating new log.
I20250902 21:32:01.460469 4817 tablet_bootstrap.cc:492] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed: No bootstrap required, opened a new log
I20250902 21:32:01.460521 4817 ts_tablet_manager.cc:1397] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed: Time spent bootstrapping tablet: real 0.009s user 0.001s sys 0.000s
I20250902 21:32:01.460989 4815 tablet_bootstrap.cc:492] T 2f100030f5ec42f085bf83f379ebb850 P 9a6e0058b145476b9c65606690bad44c: No bootstrap required, opened a new log
I20250902 21:32:01.461024 4815 ts_tablet_manager.cc:1397] T 2f100030f5ec42f085bf83f379ebb850 P 9a6e0058b145476b9c65606690bad44c: Time spent bootstrapping tablet: real 0.001s user 0.000s sys 0.001s
I20250902 21:32:01.461134 4815 raft_consensus.cc:357] T 2f100030f5ec42f085bf83f379ebb850 P 9a6e0058b145476b9c65606690bad44c [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.461179 4815 raft_consensus.cc:383] T 2f100030f5ec42f085bf83f379ebb850 P 9a6e0058b145476b9c65606690bad44c [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250902 21:32:01.461195 4815 raft_consensus.cc:738] T 2f100030f5ec42f085bf83f379ebb850 P 9a6e0058b145476b9c65606690bad44c [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 9a6e0058b145476b9c65606690bad44c, State: Initialized, Role: FOLLOWER
I20250902 21:32:01.461238 4815 consensus_queue.cc:260] T 2f100030f5ec42f085bf83f379ebb850 P 9a6e0058b145476b9c65606690bad44c [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.461319 4815 ts_tablet_manager.cc:1428] T 2f100030f5ec42f085bf83f379ebb850 P 9a6e0058b145476b9c65606690bad44c: Time spent starting tablet: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:01.461360 4815 tablet_bootstrap.cc:492] T 32d5814bb7e34ba8a525018a3d441fc7 P 9a6e0058b145476b9c65606690bad44c: Bootstrap starting.
I20250902 21:32:01.461714 4815 tablet_bootstrap.cc:654] T 32d5814bb7e34ba8a525018a3d441fc7 P 9a6e0058b145476b9c65606690bad44c: Neither blocks nor log segments found. Creating new log.
I20250902 21:32:01.461951 4816 raft_consensus.cc:357] T 71a6015393b644f5abbd15f20c69a5ec P 4dd1d49578df44b0a0325def79d07969 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } }
I20250902 21:32:01.462060 4816 raft_consensus.cc:383] T 71a6015393b644f5abbd15f20c69a5ec P 4dd1d49578df44b0a0325def79d07969 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250902 21:32:01.462090 4816 raft_consensus.cc:738] T 71a6015393b644f5abbd15f20c69a5ec P 4dd1d49578df44b0a0325def79d07969 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 4dd1d49578df44b0a0325def79d07969, State: Initialized, Role: FOLLOWER
I20250902 21:32:01.462052 4817 raft_consensus.cc:357] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.462134 4817 raft_consensus.cc:383] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250902 21:32:01.462159 4817 raft_consensus.cc:738] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 06e82aae52a24d6db2e51581ee7ca9ed, State: Initialized, Role: FOLLOWER
I20250902 21:32:01.462167 4816 consensus_queue.cc:260] T 71a6015393b644f5abbd15f20c69a5ec P 4dd1d49578df44b0a0325def79d07969 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } }
I20250902 21:32:01.462239 4817 consensus_queue.cc:260] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.462287 4815 tablet_bootstrap.cc:492] T 32d5814bb7e34ba8a525018a3d441fc7 P 9a6e0058b145476b9c65606690bad44c: No bootstrap required, opened a new log
I20250902 21:32:01.462319 4815 ts_tablet_manager.cc:1397] T 32d5814bb7e34ba8a525018a3d441fc7 P 9a6e0058b145476b9c65606690bad44c: Time spent bootstrapping tablet: real 0.001s user 0.000s sys 0.001s
I20250902 21:32:01.462373 4816 ts_tablet_manager.cc:1428] T 71a6015393b644f5abbd15f20c69a5ec P 4dd1d49578df44b0a0325def79d07969: Time spent starting tablet: real 0.002s user 0.002s sys 0.000s
I20250902 21:32:01.462389 4534 heartbeater.cc:499] Master 127.4.52.254:35631 was elected leader, sending a full tablet report...
I20250902 21:32:01.462438 4816 tablet_bootstrap.cc:492] T b3f81968196f41b3a5e3bead654d937a P 4dd1d49578df44b0a0325def79d07969: Bootstrap starting.
I20250902 21:32:01.462453 4817 ts_tablet_manager.cc:1428] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed: Time spent starting tablet: real 0.002s user 0.002s sys 0.000s
I20250902 21:32:01.462455 4815 raft_consensus.cc:357] T 32d5814bb7e34ba8a525018a3d441fc7 P 9a6e0058b145476b9c65606690bad44c [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.462500 4815 raft_consensus.cc:383] T 32d5814bb7e34ba8a525018a3d441fc7 P 9a6e0058b145476b9c65606690bad44c [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250902 21:32:01.462518 4815 raft_consensus.cc:738] T 32d5814bb7e34ba8a525018a3d441fc7 P 9a6e0058b145476b9c65606690bad44c [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 9a6e0058b145476b9c65606690bad44c, State: Initialized, Role: FOLLOWER
I20250902 21:32:01.462549 4817 tablet_bootstrap.cc:492] T 85463c5915304980b0a2aba153ed3da0 P 06e82aae52a24d6db2e51581ee7ca9ed: Bootstrap starting.
I20250902 21:32:01.462575 4815 consensus_queue.cc:260] T 32d5814bb7e34ba8a525018a3d441fc7 P 9a6e0058b145476b9c65606690bad44c [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.462652 4815 ts_tablet_manager.cc:1428] T 32d5814bb7e34ba8a525018a3d441fc7 P 9a6e0058b145476b9c65606690bad44c: Time spent starting tablet: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:01.462703 4815 tablet_bootstrap.cc:492] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c: Bootstrap starting.
I20250902 21:32:01.462844 4816 tablet_bootstrap.cc:654] T b3f81968196f41b3a5e3bead654d937a P 4dd1d49578df44b0a0325def79d07969: Neither blocks nor log segments found. Creating new log.
I20250902 21:32:01.462983 4817 tablet_bootstrap.cc:654] T 85463c5915304980b0a2aba153ed3da0 P 06e82aae52a24d6db2e51581ee7ca9ed: Neither blocks nor log segments found. Creating new log.
I20250902 21:32:01.463100 4815 tablet_bootstrap.cc:654] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c: Neither blocks nor log segments found. Creating new log.
I20250902 21:32:01.463295 4796 heartbeater.cc:499] Master 127.4.52.254:35631 was elected leader, sending a full tablet report...
I20250902 21:32:01.463347 4821 raft_consensus.cc:491] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20250902 21:32:01.463413 4821 raft_consensus.cc:513] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.463654 4821 leader_election.cc:290] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549), 06e82aae52a24d6db2e51581ee7ca9ed (127.4.52.195:36693)
I20250902 21:32:01.464383 4816 tablet_bootstrap.cc:492] T b3f81968196f41b3a5e3bead654d937a P 4dd1d49578df44b0a0325def79d07969: No bootstrap required, opened a new log
I20250902 21:32:01.464429 4816 ts_tablet_manager.cc:1397] T b3f81968196f41b3a5e3bead654d937a P 4dd1d49578df44b0a0325def79d07969: Time spent bootstrapping tablet: real 0.002s user 0.000s sys 0.001s
I20250902 21:32:01.464562 4816 raft_consensus.cc:357] T b3f81968196f41b3a5e3bead654d937a P 4dd1d49578df44b0a0325def79d07969 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.464612 4816 raft_consensus.cc:383] T b3f81968196f41b3a5e3bead654d937a P 4dd1d49578df44b0a0325def79d07969 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250902 21:32:01.464632 4816 raft_consensus.cc:738] T b3f81968196f41b3a5e3bead654d937a P 4dd1d49578df44b0a0325def79d07969 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 4dd1d49578df44b0a0325def79d07969, State: Initialized, Role: FOLLOWER
I20250902 21:32:01.464674 4816 consensus_queue.cc:260] T b3f81968196f41b3a5e3bead654d937a P 4dd1d49578df44b0a0325def79d07969 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.464738 4816 ts_tablet_manager.cc:1428] T b3f81968196f41b3a5e3bead654d937a P 4dd1d49578df44b0a0325def79d07969: Time spent starting tablet: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:01.464776 4816 tablet_bootstrap.cc:492] T 85463c5915304980b0a2aba153ed3da0 P 4dd1d49578df44b0a0325def79d07969: Bootstrap starting.
I20250902 21:32:01.465159 4816 tablet_bootstrap.cc:654] T 85463c5915304980b0a2aba153ed3da0 P 4dd1d49578df44b0a0325def79d07969: Neither blocks nor log segments found. Creating new log.
I20250902 21:32:01.465620 4817 tablet_bootstrap.cc:492] T 85463c5915304980b0a2aba153ed3da0 P 06e82aae52a24d6db2e51581ee7ca9ed: No bootstrap required, opened a new log
I20250902 21:32:01.465664 4817 ts_tablet_manager.cc:1397] T 85463c5915304980b0a2aba153ed3da0 P 06e82aae52a24d6db2e51581ee7ca9ed: Time spent bootstrapping tablet: real 0.003s user 0.001s sys 0.000s
I20250902 21:32:01.465772 4817 raft_consensus.cc:357] T 85463c5915304980b0a2aba153ed3da0 P 06e82aae52a24d6db2e51581ee7ca9ed [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.465804 4817 raft_consensus.cc:383] T 85463c5915304980b0a2aba153ed3da0 P 06e82aae52a24d6db2e51581ee7ca9ed [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250902 21:32:01.465816 4817 raft_consensus.cc:738] T 85463c5915304980b0a2aba153ed3da0 P 06e82aae52a24d6db2e51581ee7ca9ed [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 06e82aae52a24d6db2e51581ee7ca9ed, State: Initialized, Role: FOLLOWER
I20250902 21:32:01.465847 4817 consensus_queue.cc:260] T 85463c5915304980b0a2aba153ed3da0 P 06e82aae52a24d6db2e51581ee7ca9ed [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.465919 4817 ts_tablet_manager.cc:1428] T 85463c5915304980b0a2aba153ed3da0 P 06e82aae52a24d6db2e51581ee7ca9ed: Time spent starting tablet: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:01.465956 4817 tablet_bootstrap.cc:492] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed: Bootstrap starting.
I20250902 21:32:01.466336 4817 tablet_bootstrap.cc:654] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed: Neither blocks nor log segments found. Creating new log.
I20250902 21:32:01.466840 4815 tablet_bootstrap.cc:492] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c: No bootstrap required, opened a new log
I20250902 21:32:01.466881 4815 ts_tablet_manager.cc:1397] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c: Time spent bootstrapping tablet: real 0.004s user 0.001s sys 0.000s
I20250902 21:32:01.467007 4815 raft_consensus.cc:357] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.467067 4815 raft_consensus.cc:383] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250902 21:32:01.467103 4815 raft_consensus.cc:738] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 9a6e0058b145476b9c65606690bad44c, State: Initialized, Role: FOLLOWER
I20250902 21:32:01.467140 4817 tablet_bootstrap.cc:492] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed: No bootstrap required, opened a new log
I20250902 21:32:01.467152 4815 consensus_queue.cc:260] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.467181 4817 ts_tablet_manager.cc:1397] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed: Time spent bootstrapping tablet: real 0.001s user 0.001s sys 0.000s
I20250902 21:32:01.467295 4750 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "85463c5915304980b0a2aba153ed3da0" candidate_uuid: "9a6e0058b145476b9c65606690bad44c" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" is_pre_election: true
I20250902 21:32:01.467332 4817 raft_consensus.cc:357] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.467384 4817 raft_consensus.cc:383] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250902 21:32:01.467388 4815 ts_tablet_manager.cc:1428] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c: Time spent starting tablet: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:01.467403 4817 raft_consensus.cc:738] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 06e82aae52a24d6db2e51581ee7ca9ed, State: Initialized, Role: FOLLOWER
I20250902 21:32:01.467413 4750 raft_consensus.cc:2466] T 85463c5915304980b0a2aba153ed3da0 P 06e82aae52a24d6db2e51581ee7ca9ed [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 9a6e0058b145476b9c65606690bad44c in term 0.
I20250902 21:32:01.467401 4488 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "85463c5915304980b0a2aba153ed3da0" candidate_uuid: "9a6e0058b145476b9c65606690bad44c" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "4dd1d49578df44b0a0325def79d07969" is_pre_election: true
I20250902 21:32:01.467530 4817 consensus_queue.cc:260] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
W20250902 21:32:01.467679 4553 leader_election.cc:343] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c [CANDIDATE]: Term 1 pre-election: Tablet error from VoteRequest() call to peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Illegal state: must be running to vote when last-logged opid is not known
I20250902 21:32:01.467722 4817 ts_tablet_manager.cc:1428] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed: Time spent starting tablet: real 0.001s user 0.001s sys 0.000s
I20250902 21:32:01.467780 4817 tablet_bootstrap.cc:492] T 8c955e3402124b248b2740afe7cdff4d P 06e82aae52a24d6db2e51581ee7ca9ed: Bootstrap starting.
I20250902 21:32:01.467806 4552 leader_election.cc:304] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 06e82aae52a24d6db2e51581ee7ca9ed, 9a6e0058b145476b9c65606690bad44c; no voters: 4dd1d49578df44b0a0325def79d07969
I20250902 21:32:01.467938 4821 raft_consensus.cc:2802] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c [term 0 FOLLOWER]: Leader pre-election won for term 1
I20250902 21:32:01.468001 4821 raft_consensus.cc:491] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20250902 21:32:01.468045 4821 raft_consensus.cc:3058] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c [term 0 FOLLOWER]: Advancing to term 1
I20250902 21:32:01.468256 4817 tablet_bootstrap.cc:654] T 8c955e3402124b248b2740afe7cdff4d P 06e82aae52a24d6db2e51581ee7ca9ed: Neither blocks nor log segments found. Creating new log.
I20250902 21:32:01.468562 4821 raft_consensus.cc:513] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.468700 4821 leader_election.cc:290] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c [CANDIDATE]: Term 1 election: Requested vote from peers 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549), 06e82aae52a24d6db2e51581ee7ca9ed (127.4.52.195:36693)
I20250902 21:32:01.468777 4817 tablet_bootstrap.cc:492] T 8c955e3402124b248b2740afe7cdff4d P 06e82aae52a24d6db2e51581ee7ca9ed: No bootstrap required, opened a new log
I20250902 21:32:01.468784 4816 tablet_bootstrap.cc:492] T 85463c5915304980b0a2aba153ed3da0 P 4dd1d49578df44b0a0325def79d07969: No bootstrap required, opened a new log
I20250902 21:32:01.468816 4817 ts_tablet_manager.cc:1397] T 8c955e3402124b248b2740afe7cdff4d P 06e82aae52a24d6db2e51581ee7ca9ed: Time spent bootstrapping tablet: real 0.001s user 0.001s sys 0.000s
I20250902 21:32:01.468820 4816 ts_tablet_manager.cc:1397] T 85463c5915304980b0a2aba153ed3da0 P 4dd1d49578df44b0a0325def79d07969: Time spent bootstrapping tablet: real 0.004s user 0.000s sys 0.001s
I20250902 21:32:01.468824 4488 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "85463c5915304980b0a2aba153ed3da0" candidate_uuid: "9a6e0058b145476b9c65606690bad44c" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "4dd1d49578df44b0a0325def79d07969"
I20250902 21:32:01.468930 4750 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "85463c5915304980b0a2aba153ed3da0" candidate_uuid: "9a6e0058b145476b9c65606690bad44c" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "06e82aae52a24d6db2e51581ee7ca9ed"
I20250902 21:32:01.468972 4750 raft_consensus.cc:3058] T 85463c5915304980b0a2aba153ed3da0 P 06e82aae52a24d6db2e51581ee7ca9ed [term 0 FOLLOWER]: Advancing to term 1
I20250902 21:32:01.468977 4817 raft_consensus.cc:357] T 8c955e3402124b248b2740afe7cdff4d P 06e82aae52a24d6db2e51581ee7ca9ed [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.469064 4817 raft_consensus.cc:383] T 8c955e3402124b248b2740afe7cdff4d P 06e82aae52a24d6db2e51581ee7ca9ed [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250902 21:32:01.469084 4817 raft_consensus.cc:738] T 8c955e3402124b248b2740afe7cdff4d P 06e82aae52a24d6db2e51581ee7ca9ed [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 06e82aae52a24d6db2e51581ee7ca9ed, State: Initialized, Role: FOLLOWER
I20250902 21:32:01.469123 4817 consensus_queue.cc:260] T 8c955e3402124b248b2740afe7cdff4d P 06e82aae52a24d6db2e51581ee7ca9ed [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.469120 4816 raft_consensus.cc:357] T 85463c5915304980b0a2aba153ed3da0 P 4dd1d49578df44b0a0325def79d07969 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.469192 4816 raft_consensus.cc:383] T 85463c5915304980b0a2aba153ed3da0 P 4dd1d49578df44b0a0325def79d07969 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250902 21:32:01.469197 4817 ts_tablet_manager.cc:1428] T 8c955e3402124b248b2740afe7cdff4d P 06e82aae52a24d6db2e51581ee7ca9ed: Time spent starting tablet: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:01.469214 4816 raft_consensus.cc:738] T 85463c5915304980b0a2aba153ed3da0 P 4dd1d49578df44b0a0325def79d07969 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 4dd1d49578df44b0a0325def79d07969, State: Initialized, Role: FOLLOWER
W20250902 21:32:01.469208 4553 leader_election.cc:343] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c [CANDIDATE]: Term 1 election: Tablet error from VoteRequest() call to peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Illegal state: must be running to vote when last-logged opid is not known
I20250902 21:32:01.469254 4817 tablet_bootstrap.cc:492] T b3f81968196f41b3a5e3bead654d937a P 06e82aae52a24d6db2e51581ee7ca9ed: Bootstrap starting.
I20250902 21:32:01.469277 4816 consensus_queue.cc:260] T 85463c5915304980b0a2aba153ed3da0 P 4dd1d49578df44b0a0325def79d07969 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.469362 4816 ts_tablet_manager.cc:1428] T 85463c5915304980b0a2aba153ed3da0 P 4dd1d49578df44b0a0325def79d07969: Time spent starting tablet: real 0.000s user 0.000s sys 0.001s
I20250902 21:32:01.469412 4816 tablet_bootstrap.cc:492] T 2f100030f5ec42f085bf83f379ebb850 P 4dd1d49578df44b0a0325def79d07969: Bootstrap starting.
I20250902 21:32:01.469851 4816 tablet_bootstrap.cc:654] T 2f100030f5ec42f085bf83f379ebb850 P 4dd1d49578df44b0a0325def79d07969: Neither blocks nor log segments found. Creating new log.
I20250902 21:32:01.469669 4817 tablet_bootstrap.cc:654] T b3f81968196f41b3a5e3bead654d937a P 06e82aae52a24d6db2e51581ee7ca9ed: Neither blocks nor log segments found. Creating new log.
I20250902 21:32:01.469760 4750 raft_consensus.cc:2466] T 85463c5915304980b0a2aba153ed3da0 P 06e82aae52a24d6db2e51581ee7ca9ed [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 9a6e0058b145476b9c65606690bad44c in term 1.
I20250902 21:32:01.470228 4552 leader_election.cc:304] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 06e82aae52a24d6db2e51581ee7ca9ed, 9a6e0058b145476b9c65606690bad44c; no voters: 4dd1d49578df44b0a0325def79d07969
I20250902 21:32:01.470324 4821 raft_consensus.cc:2802] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c [term 1 FOLLOWER]: Leader election won for term 1
I20250902 21:32:01.470443 4821 raft_consensus.cc:695] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c [term 1 LEADER]: Becoming Leader. State: Replica: 9a6e0058b145476b9c65606690bad44c, State: Running, Role: LEADER
I20250902 21:32:01.470520 4821 consensus_queue.cc:237] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.470621 4817 tablet_bootstrap.cc:492] T b3f81968196f41b3a5e3bead654d937a P 06e82aae52a24d6db2e51581ee7ca9ed: No bootstrap required, opened a new log
I20250902 21:32:01.470679 4817 ts_tablet_manager.cc:1397] T b3f81968196f41b3a5e3bead654d937a P 06e82aae52a24d6db2e51581ee7ca9ed: Time spent bootstrapping tablet: real 0.001s user 0.001s sys 0.000s
I20250902 21:32:01.470687 4816 tablet_bootstrap.cc:492] T 2f100030f5ec42f085bf83f379ebb850 P 4dd1d49578df44b0a0325def79d07969: No bootstrap required, opened a new log
I20250902 21:32:01.470722 4816 ts_tablet_manager.cc:1397] T 2f100030f5ec42f085bf83f379ebb850 P 4dd1d49578df44b0a0325def79d07969: Time spent bootstrapping tablet: real 0.001s user 0.000s sys 0.001s
I20250902 21:32:01.470863 4817 raft_consensus.cc:357] T b3f81968196f41b3a5e3bead654d937a P 06e82aae52a24d6db2e51581ee7ca9ed [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.470889 4816 raft_consensus.cc:357] T 2f100030f5ec42f085bf83f379ebb850 P 4dd1d49578df44b0a0325def79d07969 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.470930 4817 raft_consensus.cc:383] T b3f81968196f41b3a5e3bead654d937a P 06e82aae52a24d6db2e51581ee7ca9ed [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250902 21:32:01.470942 4816 raft_consensus.cc:383] T 2f100030f5ec42f085bf83f379ebb850 P 4dd1d49578df44b0a0325def79d07969 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250902 21:32:01.470956 4817 raft_consensus.cc:738] T b3f81968196f41b3a5e3bead654d937a P 06e82aae52a24d6db2e51581ee7ca9ed [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 06e82aae52a24d6db2e51581ee7ca9ed, State: Initialized, Role: FOLLOWER
I20250902 21:32:01.470971 4816 raft_consensus.cc:738] T 2f100030f5ec42f085bf83f379ebb850 P 4dd1d49578df44b0a0325def79d07969 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 4dd1d49578df44b0a0325def79d07969, State: Initialized, Role: FOLLOWER
I20250902 21:32:01.470994 4817 consensus_queue.cc:260] T b3f81968196f41b3a5e3bead654d937a P 06e82aae52a24d6db2e51581ee7ca9ed [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.471010 4816 consensus_queue.cc:260] T 2f100030f5ec42f085bf83f379ebb850 P 4dd1d49578df44b0a0325def79d07969 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.471091 4816 ts_tablet_manager.cc:1428] T 2f100030f5ec42f085bf83f379ebb850 P 4dd1d49578df44b0a0325def79d07969: Time spent starting tablet: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:01.471097 4817 ts_tablet_manager.cc:1428] T b3f81968196f41b3a5e3bead654d937a P 06e82aae52a24d6db2e51581ee7ca9ed: Time spent starting tablet: real 0.000s user 0.000s sys 0.001s
I20250902 21:32:01.471154 4817 tablet_bootstrap.cc:492] T 71a6015393b644f5abbd15f20c69a5ec P 06e82aae52a24d6db2e51581ee7ca9ed: Bootstrap starting.
I20250902 21:32:01.471149 4816 tablet_bootstrap.cc:492] T 8c955e3402124b248b2740afe7cdff4d P 4dd1d49578df44b0a0325def79d07969: Bootstrap starting.
I20250902 21:32:01.471342 4345 catalog_manager.cc:5582] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c reported cstate change: term changed from 0 to 1, leader changed from <none> to 9a6e0058b145476b9c65606690bad44c (127.4.52.194). New cstate: current_term: 1 leader_uuid: "9a6e0058b145476b9c65606690bad44c" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } health_report { overall_health: UNKNOWN } } }
I20250902 21:32:01.471566 4816 tablet_bootstrap.cc:654] T 8c955e3402124b248b2740afe7cdff4d P 4dd1d49578df44b0a0325def79d07969: Neither blocks nor log segments found. Creating new log.
I20250902 21:32:01.471719 4817 tablet_bootstrap.cc:654] T 71a6015393b644f5abbd15f20c69a5ec P 06e82aae52a24d6db2e51581ee7ca9ed: Neither blocks nor log segments found. Creating new log.
I20250902 21:32:01.472277 4817 tablet_bootstrap.cc:492] T 71a6015393b644f5abbd15f20c69a5ec P 06e82aae52a24d6db2e51581ee7ca9ed: No bootstrap required, opened a new log
I20250902 21:32:01.472318 4817 ts_tablet_manager.cc:1397] T 71a6015393b644f5abbd15f20c69a5ec P 06e82aae52a24d6db2e51581ee7ca9ed: Time spent bootstrapping tablet: real 0.001s user 0.000s sys 0.001s
I20250902 21:32:01.472456 4817 raft_consensus.cc:357] T 71a6015393b644f5abbd15f20c69a5ec P 06e82aae52a24d6db2e51581ee7ca9ed [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } }
I20250902 21:32:01.472541 4817 raft_consensus.cc:383] T 71a6015393b644f5abbd15f20c69a5ec P 06e82aae52a24d6db2e51581ee7ca9ed [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250902 21:32:01.472568 4817 raft_consensus.cc:738] T 71a6015393b644f5abbd15f20c69a5ec P 06e82aae52a24d6db2e51581ee7ca9ed [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 06e82aae52a24d6db2e51581ee7ca9ed, State: Initialized, Role: FOLLOWER
I20250902 21:32:01.472608 4817 consensus_queue.cc:260] T 71a6015393b644f5abbd15f20c69a5ec P 06e82aae52a24d6db2e51581ee7ca9ed [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } }
I20250902 21:32:01.472644 4816 tablet_bootstrap.cc:492] T 8c955e3402124b248b2740afe7cdff4d P 4dd1d49578df44b0a0325def79d07969: No bootstrap required, opened a new log
I20250902 21:32:01.472690 4816 ts_tablet_manager.cc:1397] T 8c955e3402124b248b2740afe7cdff4d P 4dd1d49578df44b0a0325def79d07969: Time spent bootstrapping tablet: real 0.002s user 0.000s sys 0.001s
I20250902 21:32:01.472690 4817 ts_tablet_manager.cc:1428] T 71a6015393b644f5abbd15f20c69a5ec P 06e82aae52a24d6db2e51581ee7ca9ed: Time spent starting tablet: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:01.472821 4816 raft_consensus.cc:357] T 8c955e3402124b248b2740afe7cdff4d P 4dd1d49578df44b0a0325def79d07969 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.472873 4816 raft_consensus.cc:383] T 8c955e3402124b248b2740afe7cdff4d P 4dd1d49578df44b0a0325def79d07969 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250902 21:32:01.472887 4816 raft_consensus.cc:738] T 8c955e3402124b248b2740afe7cdff4d P 4dd1d49578df44b0a0325def79d07969 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 4dd1d49578df44b0a0325def79d07969, State: Initialized, Role: FOLLOWER
I20250902 21:32:01.472919 4816 consensus_queue.cc:260] T 8c955e3402124b248b2740afe7cdff4d P 4dd1d49578df44b0a0325def79d07969 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.473007 4816 ts_tablet_manager.cc:1428] T 8c955e3402124b248b2740afe7cdff4d P 4dd1d49578df44b0a0325def79d07969: Time spent starting tablet: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:01.473057 4816 tablet_bootstrap.cc:492] T 32d5814bb7e34ba8a525018a3d441fc7 P 4dd1d49578df44b0a0325def79d07969: Bootstrap starting.
I20250902 21:32:01.473608 4816 tablet_bootstrap.cc:654] T 32d5814bb7e34ba8a525018a3d441fc7 P 4dd1d49578df44b0a0325def79d07969: Neither blocks nor log segments found. Creating new log.
I20250902 21:32:01.474053 4816 tablet_bootstrap.cc:492] T 32d5814bb7e34ba8a525018a3d441fc7 P 4dd1d49578df44b0a0325def79d07969: No bootstrap required, opened a new log
I20250902 21:32:01.474088 4816 ts_tablet_manager.cc:1397] T 32d5814bb7e34ba8a525018a3d441fc7 P 4dd1d49578df44b0a0325def79d07969: Time spent bootstrapping tablet: real 0.001s user 0.000s sys 0.001s
I20250902 21:32:01.474195 4816 raft_consensus.cc:357] T 32d5814bb7e34ba8a525018a3d441fc7 P 4dd1d49578df44b0a0325def79d07969 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.474237 4816 raft_consensus.cc:383] T 32d5814bb7e34ba8a525018a3d441fc7 P 4dd1d49578df44b0a0325def79d07969 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250902 21:32:01.474251 4816 raft_consensus.cc:738] T 32d5814bb7e34ba8a525018a3d441fc7 P 4dd1d49578df44b0a0325def79d07969 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 4dd1d49578df44b0a0325def79d07969, State: Initialized, Role: FOLLOWER
I20250902 21:32:01.474299 4816 consensus_queue.cc:260] T 32d5814bb7e34ba8a525018a3d441fc7 P 4dd1d49578df44b0a0325def79d07969 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.474371 4816 ts_tablet_manager.cc:1428] T 32d5814bb7e34ba8a525018a3d441fc7 P 4dd1d49578df44b0a0325def79d07969: Time spent starting tablet: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:01.479076 4826 raft_consensus.cc:491] T 85463c5915304980b0a2aba153ed3da0 P 4dd1d49578df44b0a0325def79d07969 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20250902 21:32:01.479136 4826 raft_consensus.cc:513] T 85463c5915304980b0a2aba153ed3da0 P 4dd1d49578df44b0a0325def79d07969 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.479324 4826 leader_election.cc:290] T 85463c5915304980b0a2aba153ed3da0 P 4dd1d49578df44b0a0325def79d07969 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 9a6e0058b145476b9c65606690bad44c (127.4.52.194:33533), 06e82aae52a24d6db2e51581ee7ca9ed (127.4.52.195:36693)
I20250902 21:32:01.481638 4619 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "85463c5915304980b0a2aba153ed3da0" candidate_uuid: "4dd1d49578df44b0a0325def79d07969" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "9a6e0058b145476b9c65606690bad44c" is_pre_election: true
I20250902 21:32:01.481786 4750 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "85463c5915304980b0a2aba153ed3da0" candidate_uuid: "4dd1d49578df44b0a0325def79d07969" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" is_pre_election: true
I20250902 21:32:01.481861 4750 raft_consensus.cc:2391] T 85463c5915304980b0a2aba153ed3da0 P 06e82aae52a24d6db2e51581ee7ca9ed [term 1 FOLLOWER]: Leader pre-election vote request: Denying vote to candidate 4dd1d49578df44b0a0325def79d07969 in current term 1: Already voted for candidate 9a6e0058b145476b9c65606690bad44c in this term.
I20250902 21:32:01.482044 4421 leader_election.cc:304] T 85463c5915304980b0a2aba153ed3da0 P 4dd1d49578df44b0a0325def79d07969 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 4dd1d49578df44b0a0325def79d07969; no voters: 06e82aae52a24d6db2e51581ee7ca9ed, 9a6e0058b145476b9c65606690bad44c
I20250902 21:32:01.482136 4826 raft_consensus.cc:3058] T 85463c5915304980b0a2aba153ed3da0 P 4dd1d49578df44b0a0325def79d07969 [term 0 FOLLOWER]: Advancing to term 1
I20250902 21:32:01.482659 4826 raft_consensus.cc:2747] T 85463c5915304980b0a2aba153ed3da0 P 4dd1d49578df44b0a0325def79d07969 [term 1 FOLLOWER]: Leader pre-election lost for term 1. Reason: could not achieve majority
I20250902 21:32:01.493314 4834 raft_consensus.cc:491] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20250902 21:32:01.493383 4834 raft_consensus.cc:513] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.493633 4834 leader_election.cc:290] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549), 9a6e0058b145476b9c65606690bad44c (127.4.52.194:33533)
I20250902 21:32:01.495831 4488 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "32d5814bb7e34ba8a525018a3d441fc7" candidate_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "4dd1d49578df44b0a0325def79d07969" is_pre_election: true
I20250902 21:32:01.495909 4488 raft_consensus.cc:2466] T 32d5814bb7e34ba8a525018a3d441fc7 P 4dd1d49578df44b0a0325def79d07969 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 06e82aae52a24d6db2e51581ee7ca9ed in term 0.
I20250902 21:32:01.496132 4684 leader_election.cc:304] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 06e82aae52a24d6db2e51581ee7ca9ed, 4dd1d49578df44b0a0325def79d07969; no voters:
I20250902 21:32:01.496124 4619 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "32d5814bb7e34ba8a525018a3d441fc7" candidate_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "9a6e0058b145476b9c65606690bad44c" is_pre_election: true
I20250902 21:32:01.496191 4619 raft_consensus.cc:2466] T 32d5814bb7e34ba8a525018a3d441fc7 P 9a6e0058b145476b9c65606690bad44c [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 06e82aae52a24d6db2e51581ee7ca9ed in term 0.
I20250902 21:32:01.496243 4834 raft_consensus.cc:2802] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed [term 0 FOLLOWER]: Leader pre-election won for term 1
I20250902 21:32:01.496309 4834 raft_consensus.cc:491] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20250902 21:32:01.496340 4834 raft_consensus.cc:3058] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed [term 0 FOLLOWER]: Advancing to term 1
I20250902 21:32:01.496850 4834 raft_consensus.cc:513] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.496950 4834 leader_election.cc:290] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed [CANDIDATE]: Term 1 election: Requested vote from peers 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549), 9a6e0058b145476b9c65606690bad44c (127.4.52.194:33533)
I20250902 21:32:01.497104 4488 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "32d5814bb7e34ba8a525018a3d441fc7" candidate_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "4dd1d49578df44b0a0325def79d07969"
I20250902 21:32:01.497159 4488 raft_consensus.cc:3058] T 32d5814bb7e34ba8a525018a3d441fc7 P 4dd1d49578df44b0a0325def79d07969 [term 0 FOLLOWER]: Advancing to term 1
I20250902 21:32:01.497674 4488 raft_consensus.cc:2466] T 32d5814bb7e34ba8a525018a3d441fc7 P 4dd1d49578df44b0a0325def79d07969 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 06e82aae52a24d6db2e51581ee7ca9ed in term 1.
I20250902 21:32:01.497761 4619 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "32d5814bb7e34ba8a525018a3d441fc7" candidate_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "9a6e0058b145476b9c65606690bad44c"
I20250902 21:32:01.497815 4684 leader_election.cc:304] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 06e82aae52a24d6db2e51581ee7ca9ed, 4dd1d49578df44b0a0325def79d07969; no voters:
I20250902 21:32:01.497841 4619 raft_consensus.cc:3058] T 32d5814bb7e34ba8a525018a3d441fc7 P 9a6e0058b145476b9c65606690bad44c [term 0 FOLLOWER]: Advancing to term 1
I20250902 21:32:01.497889 4834 raft_consensus.cc:2802] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed [term 1 FOLLOWER]: Leader election won for term 1
I20250902 21:32:01.497934 4834 raft_consensus.cc:695] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed [term 1 LEADER]: Becoming Leader. State: Replica: 06e82aae52a24d6db2e51581ee7ca9ed, State: Running, Role: LEADER
I20250902 21:32:01.498010 4834 consensus_queue.cc:237] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.498464 4619 raft_consensus.cc:2466] T 32d5814bb7e34ba8a525018a3d441fc7 P 9a6e0058b145476b9c65606690bad44c [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 06e82aae52a24d6db2e51581ee7ca9ed in term 1.
I20250902 21:32:01.498884 4345 catalog_manager.cc:5582] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed reported cstate change: term changed from 0 to 1, leader changed from <none> to 06e82aae52a24d6db2e51581ee7ca9ed (127.4.52.195). New cstate: current_term: 1 leader_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } health_report { overall_health: HEALTHY } } }
I20250902 21:32:01.499838 4826 raft_consensus.cc:491] T b3f81968196f41b3a5e3bead654d937a P 4dd1d49578df44b0a0325def79d07969 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20250902 21:32:01.499886 4826 raft_consensus.cc:513] T b3f81968196f41b3a5e3bead654d937a P 4dd1d49578df44b0a0325def79d07969 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.500001 4826 leader_election.cc:290] T b3f81968196f41b3a5e3bead654d937a P 4dd1d49578df44b0a0325def79d07969 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 9a6e0058b145476b9c65606690bad44c (127.4.52.194:33533), 06e82aae52a24d6db2e51581ee7ca9ed (127.4.52.195:36693)
I20250902 21:32:01.500142 4619 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "b3f81968196f41b3a5e3bead654d937a" candidate_uuid: "4dd1d49578df44b0a0325def79d07969" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "9a6e0058b145476b9c65606690bad44c" is_pre_election: true
I20250902 21:32:01.500200 4619 raft_consensus.cc:2466] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 4dd1d49578df44b0a0325def79d07969 in term 0.
I20250902 21:32:01.500196 4750 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "b3f81968196f41b3a5e3bead654d937a" candidate_uuid: "4dd1d49578df44b0a0325def79d07969" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" is_pre_election: true
I20250902 21:32:01.500258 4750 raft_consensus.cc:2466] T b3f81968196f41b3a5e3bead654d937a P 06e82aae52a24d6db2e51581ee7ca9ed [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 4dd1d49578df44b0a0325def79d07969 in term 0.
I20250902 21:32:01.500365 4421 leader_election.cc:304] T b3f81968196f41b3a5e3bead654d937a P 4dd1d49578df44b0a0325def79d07969 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 06e82aae52a24d6db2e51581ee7ca9ed, 4dd1d49578df44b0a0325def79d07969; no voters:
I20250902 21:32:01.500447 4826 raft_consensus.cc:2802] T b3f81968196f41b3a5e3bead654d937a P 4dd1d49578df44b0a0325def79d07969 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20250902 21:32:01.500509 4826 raft_consensus.cc:491] T b3f81968196f41b3a5e3bead654d937a P 4dd1d49578df44b0a0325def79d07969 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20250902 21:32:01.500540 4826 raft_consensus.cc:3058] T b3f81968196f41b3a5e3bead654d937a P 4dd1d49578df44b0a0325def79d07969 [term 0 FOLLOWER]: Advancing to term 1
I20250902 21:32:01.500978 4826 raft_consensus.cc:513] T b3f81968196f41b3a5e3bead654d937a P 4dd1d49578df44b0a0325def79d07969 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.501070 4826 leader_election.cc:290] T b3f81968196f41b3a5e3bead654d937a P 4dd1d49578df44b0a0325def79d07969 [CANDIDATE]: Term 1 election: Requested vote from peers 9a6e0058b145476b9c65606690bad44c (127.4.52.194:33533), 06e82aae52a24d6db2e51581ee7ca9ed (127.4.52.195:36693)
I20250902 21:32:01.501227 4750 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "b3f81968196f41b3a5e3bead654d937a" candidate_uuid: "4dd1d49578df44b0a0325def79d07969" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "06e82aae52a24d6db2e51581ee7ca9ed"
I20250902 21:32:01.501226 4619 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "b3f81968196f41b3a5e3bead654d937a" candidate_uuid: "4dd1d49578df44b0a0325def79d07969" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "9a6e0058b145476b9c65606690bad44c"
I20250902 21:32:01.501281 4619 raft_consensus.cc:3058] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c [term 0 FOLLOWER]: Advancing to term 1
I20250902 21:32:01.501282 4750 raft_consensus.cc:3058] T b3f81968196f41b3a5e3bead654d937a P 06e82aae52a24d6db2e51581ee7ca9ed [term 0 FOLLOWER]: Advancing to term 1
I20250902 21:32:01.501711 4619 raft_consensus.cc:2466] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 4dd1d49578df44b0a0325def79d07969 in term 1.
I20250902 21:32:01.501720 4750 raft_consensus.cc:2466] T b3f81968196f41b3a5e3bead654d937a P 06e82aae52a24d6db2e51581ee7ca9ed [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 4dd1d49578df44b0a0325def79d07969 in term 1.
I20250902 21:32:01.501852 4421 leader_election.cc:304] T b3f81968196f41b3a5e3bead654d937a P 4dd1d49578df44b0a0325def79d07969 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 4dd1d49578df44b0a0325def79d07969, 9a6e0058b145476b9c65606690bad44c; no voters:
I20250902 21:32:01.501933 4826 raft_consensus.cc:2802] T b3f81968196f41b3a5e3bead654d937a P 4dd1d49578df44b0a0325def79d07969 [term 1 FOLLOWER]: Leader election won for term 1
I20250902 21:32:01.502074 4826 raft_consensus.cc:695] T b3f81968196f41b3a5e3bead654d937a P 4dd1d49578df44b0a0325def79d07969 [term 1 LEADER]: Becoming Leader. State: Replica: 4dd1d49578df44b0a0325def79d07969, State: Running, Role: LEADER
I20250902 21:32:01.502149 4826 consensus_queue.cc:237] T b3f81968196f41b3a5e3bead654d937a P 4dd1d49578df44b0a0325def79d07969 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.502826 4345 catalog_manager.cc:5582] T b3f81968196f41b3a5e3bead654d937a P 4dd1d49578df44b0a0325def79d07969 reported cstate change: term changed from 0 to 1, leader changed from <none> to 4dd1d49578df44b0a0325def79d07969 (127.4.52.193). New cstate: current_term: 1 leader_uuid: "4dd1d49578df44b0a0325def79d07969" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } health_report { overall_health: UNKNOWN } } }
I20250902 21:32:01.511758 4834 raft_consensus.cc:491] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20250902 21:32:01.511830 4834 raft_consensus.cc:513] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.511963 4834 leader_election.cc:290] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549), 9a6e0058b145476b9c65606690bad44c (127.4.52.194:33533)
I20250902 21:32:01.512089 4838 raft_consensus.cc:491] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20250902 21:32:01.512115 4488 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "2f100030f5ec42f085bf83f379ebb850" candidate_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "4dd1d49578df44b0a0325def79d07969" is_pre_election: true
I20250902 21:32:01.512181 4488 raft_consensus.cc:2466] T 2f100030f5ec42f085bf83f379ebb850 P 4dd1d49578df44b0a0325def79d07969 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 06e82aae52a24d6db2e51581ee7ca9ed in term 0.
I20250902 21:32:01.512145 4619 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "2f100030f5ec42f085bf83f379ebb850" candidate_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "9a6e0058b145476b9c65606690bad44c" is_pre_election: true
I20250902 21:32:01.512149 4838 raft_consensus.cc:513] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.512249 4619 raft_consensus.cc:2466] T 2f100030f5ec42f085bf83f379ebb850 P 9a6e0058b145476b9c65606690bad44c [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 06e82aae52a24d6db2e51581ee7ca9ed in term 0.
I20250902 21:32:01.512328 4684 leader_election.cc:304] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 06e82aae52a24d6db2e51581ee7ca9ed, 4dd1d49578df44b0a0325def79d07969; no voters:
I20250902 21:32:01.512383 4838 leader_election.cc:290] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549), 06e82aae52a24d6db2e51581ee7ca9ed (127.4.52.195:36693)
I20250902 21:32:01.512463 4834 raft_consensus.cc:2802] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed [term 0 FOLLOWER]: Leader pre-election won for term 1
I20250902 21:32:01.512511 4834 raft_consensus.cc:491] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20250902 21:32:01.512507 4488 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "8c955e3402124b248b2740afe7cdff4d" candidate_uuid: "9a6e0058b145476b9c65606690bad44c" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "4dd1d49578df44b0a0325def79d07969" is_pre_election: true
I20250902 21:32:01.512586 4488 raft_consensus.cc:2466] T 8c955e3402124b248b2740afe7cdff4d P 4dd1d49578df44b0a0325def79d07969 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 9a6e0058b145476b9c65606690bad44c in term 0.
I20250902 21:32:01.512563 4750 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "8c955e3402124b248b2740afe7cdff4d" candidate_uuid: "9a6e0058b145476b9c65606690bad44c" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" is_pre_election: true
I20250902 21:32:01.512686 4750 raft_consensus.cc:2466] T 8c955e3402124b248b2740afe7cdff4d P 06e82aae52a24d6db2e51581ee7ca9ed [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 9a6e0058b145476b9c65606690bad44c in term 0.
I20250902 21:32:01.512712 4553 leader_election.cc:304] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 4dd1d49578df44b0a0325def79d07969, 9a6e0058b145476b9c65606690bad44c; no voters:
I20250902 21:32:01.512554 4834 raft_consensus.cc:3058] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed [term 0 FOLLOWER]: Advancing to term 1
I20250902 21:32:01.512784 4838 raft_consensus.cc:2802] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c [term 0 FOLLOWER]: Leader pre-election won for term 1
I20250902 21:32:01.512828 4838 raft_consensus.cc:491] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20250902 21:32:01.512851 4838 raft_consensus.cc:3058] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c [term 0 FOLLOWER]: Advancing to term 1
I20250902 21:32:01.513315 4834 raft_consensus.cc:513] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.513337 4838 raft_consensus.cc:513] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.513407 4834 leader_election.cc:290] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed [CANDIDATE]: Term 1 election: Requested vote from peers 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549), 9a6e0058b145476b9c65606690bad44c (127.4.52.194:33533)
I20250902 21:32:01.513440 4838 leader_election.cc:290] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c [CANDIDATE]: Term 1 election: Requested vote from peers 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549), 06e82aae52a24d6db2e51581ee7ca9ed (127.4.52.195:36693)
I20250902 21:32:01.513612 4488 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "2f100030f5ec42f085bf83f379ebb850" candidate_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "4dd1d49578df44b0a0325def79d07969"
I20250902 21:32:01.513612 4750 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "8c955e3402124b248b2740afe7cdff4d" candidate_uuid: "9a6e0058b145476b9c65606690bad44c" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "06e82aae52a24d6db2e51581ee7ca9ed"
I20250902 21:32:01.513667 4750 raft_consensus.cc:3058] T 8c955e3402124b248b2740afe7cdff4d P 06e82aae52a24d6db2e51581ee7ca9ed [term 0 FOLLOWER]: Advancing to term 1
I20250902 21:32:01.513654 4619 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "2f100030f5ec42f085bf83f379ebb850" candidate_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "9a6e0058b145476b9c65606690bad44c"
I20250902 21:32:01.513666 4487 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "8c955e3402124b248b2740afe7cdff4d" candidate_uuid: "9a6e0058b145476b9c65606690bad44c" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "4dd1d49578df44b0a0325def79d07969"
I20250902 21:32:01.513716 4619 raft_consensus.cc:3058] T 2f100030f5ec42f085bf83f379ebb850 P 9a6e0058b145476b9c65606690bad44c [term 0 FOLLOWER]: Advancing to term 1
I20250902 21:32:01.513725 4487 raft_consensus.cc:3058] T 8c955e3402124b248b2740afe7cdff4d P 4dd1d49578df44b0a0325def79d07969 [term 0 FOLLOWER]: Advancing to term 1
I20250902 21:32:01.514115 4750 raft_consensus.cc:2466] T 8c955e3402124b248b2740afe7cdff4d P 06e82aae52a24d6db2e51581ee7ca9ed [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 9a6e0058b145476b9c65606690bad44c in term 1.
I20250902 21:32:01.514274 4619 raft_consensus.cc:2466] T 2f100030f5ec42f085bf83f379ebb850 P 9a6e0058b145476b9c65606690bad44c [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 06e82aae52a24d6db2e51581ee7ca9ed in term 1.
I20250902 21:32:01.514283 4552 leader_election.cc:304] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 06e82aae52a24d6db2e51581ee7ca9ed, 9a6e0058b145476b9c65606690bad44c; no voters:
I20250902 21:32:01.514380 4838 raft_consensus.cc:2802] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c [term 1 FOLLOWER]: Leader election won for term 1
I20250902 21:32:01.514420 4838 raft_consensus.cc:695] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c [term 1 LEADER]: Becoming Leader. State: Replica: 9a6e0058b145476b9c65606690bad44c, State: Running, Role: LEADER
I20250902 21:32:01.514439 4487 raft_consensus.cc:2466] T 8c955e3402124b248b2740afe7cdff4d P 4dd1d49578df44b0a0325def79d07969 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 9a6e0058b145476b9c65606690bad44c in term 1.
I20250902 21:32:01.514465 4683 leader_election.cc:304] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 06e82aae52a24d6db2e51581ee7ca9ed, 9a6e0058b145476b9c65606690bad44c; no voters:
I20250902 21:32:01.514463 4838 consensus_queue.cc:237] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.514539 4834 raft_consensus.cc:2802] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed [term 1 FOLLOWER]: Leader election won for term 1
I20250902 21:32:01.514570 4834 raft_consensus.cc:695] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed [term 1 LEADER]: Becoming Leader. State: Replica: 06e82aae52a24d6db2e51581ee7ca9ed, State: Running, Role: LEADER
I20250902 21:32:01.513667 4488 raft_consensus.cc:3058] T 2f100030f5ec42f085bf83f379ebb850 P 4dd1d49578df44b0a0325def79d07969 [term 0 FOLLOWER]: Advancing to term 1
I20250902 21:32:01.514605 4834 consensus_queue.cc:237] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:01.515165 4488 raft_consensus.cc:2466] T 2f100030f5ec42f085bf83f379ebb850 P 4dd1d49578df44b0a0325def79d07969 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 06e82aae52a24d6db2e51581ee7ca9ed in term 1.
I20250902 21:32:01.515105 4344 catalog_manager.cc:5582] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed reported cstate change: term changed from 0 to 1, leader changed from <none> to 06e82aae52a24d6db2e51581ee7ca9ed (127.4.52.195). New cstate: current_term: 1 leader_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } health_report { overall_health: HEALTHY } } }
I20250902 21:32:01.515435 4345 catalog_manager.cc:5582] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c reported cstate change: term changed from 0 to 1, leader changed from <none> to 9a6e0058b145476b9c65606690bad44c (127.4.52.194). New cstate: current_term: 1 leader_uuid: "9a6e0058b145476b9c65606690bad44c" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } health_report { overall_health: UNKNOWN } } }
I20250902 21:32:01.554473 4838 consensus_queue.cc:1035] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c [LEADER]: Connected to new peer: Peer: permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20250902 21:32:01.554821 4838 raft_consensus.cc:491] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20250902 21:32:01.554888 4838 raft_consensus.cc:513] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } }
I20250902 21:32:01.554996 4838 leader_election.cc:290] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 06e82aae52a24d6db2e51581ee7ca9ed (127.4.52.195:36693), 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549)
W20250902 21:32:01.555107 4666 tablet.cc:2378] T 2f100030f5ec42f085bf83f379ebb850 P 9a6e0058b145476b9c65606690bad44c: Can't schedule compaction. Clean time has not been advanced past its initial value.
I20250902 21:32:01.555199 4487 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "71a6015393b644f5abbd15f20c69a5ec" candidate_uuid: "9a6e0058b145476b9c65606690bad44c" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "4dd1d49578df44b0a0325def79d07969" is_pre_election: true
I20250902 21:32:01.555181 4750 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "71a6015393b644f5abbd15f20c69a5ec" candidate_uuid: "9a6e0058b145476b9c65606690bad44c" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" is_pre_election: true
I20250902 21:32:01.555261 4487 raft_consensus.cc:2466] T 71a6015393b644f5abbd15f20c69a5ec P 4dd1d49578df44b0a0325def79d07969 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 9a6e0058b145476b9c65606690bad44c in term 0.
I20250902 21:32:01.555274 4750 raft_consensus.cc:2466] T 71a6015393b644f5abbd15f20c69a5ec P 06e82aae52a24d6db2e51581ee7ca9ed [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 9a6e0058b145476b9c65606690bad44c in term 0.
I20250902 21:32:01.555378 4553 leader_election.cc:304] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 4dd1d49578df44b0a0325def79d07969, 9a6e0058b145476b9c65606690bad44c; no voters:
I20250902 21:32:01.555490 4838 raft_consensus.cc:2802] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c [term 0 FOLLOWER]: Leader pre-election won for term 1
I20250902 21:32:01.555550 4838 raft_consensus.cc:491] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20250902 21:32:01.555572 4838 raft_consensus.cc:3058] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c [term 0 FOLLOWER]: Advancing to term 1
I20250902 21:32:01.556205 4838 raft_consensus.cc:513] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } }
I20250902 21:32:01.556318 4838 leader_election.cc:290] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c [CANDIDATE]: Term 1 election: Requested vote from peers 06e82aae52a24d6db2e51581ee7ca9ed (127.4.52.195:36693), 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549)
I20250902 21:32:01.556447 4750 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "71a6015393b644f5abbd15f20c69a5ec" candidate_uuid: "9a6e0058b145476b9c65606690bad44c" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "06e82aae52a24d6db2e51581ee7ca9ed"
I20250902 21:32:01.556483 4488 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "71a6015393b644f5abbd15f20c69a5ec" candidate_uuid: "9a6e0058b145476b9c65606690bad44c" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "4dd1d49578df44b0a0325def79d07969"
I20250902 21:32:01.556537 4750 raft_consensus.cc:3058] T 71a6015393b644f5abbd15f20c69a5ec P 06e82aae52a24d6db2e51581ee7ca9ed [term 0 FOLLOWER]: Advancing to term 1
I20250902 21:32:01.556550 4488 raft_consensus.cc:3058] T 71a6015393b644f5abbd15f20c69a5ec P 4dd1d49578df44b0a0325def79d07969 [term 0 FOLLOWER]: Advancing to term 1
I20250902 21:32:01.556782 4838 consensus_queue.cc:1035] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c [LEADER]: Connected to new peer: Peer: permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20250902 21:32:01.557214 4750 raft_consensus.cc:2466] T 71a6015393b644f5abbd15f20c69a5ec P 06e82aae52a24d6db2e51581ee7ca9ed [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 9a6e0058b145476b9c65606690bad44c in term 1.
I20250902 21:32:01.557251 4488 raft_consensus.cc:2466] T 71a6015393b644f5abbd15f20c69a5ec P 4dd1d49578df44b0a0325def79d07969 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 9a6e0058b145476b9c65606690bad44c in term 1.
I20250902 21:32:01.557433 4553 leader_election.cc:304] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 4dd1d49578df44b0a0325def79d07969, 9a6e0058b145476b9c65606690bad44c; no voters:
I20250902 21:32:01.557516 4838 raft_consensus.cc:2802] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c [term 1 FOLLOWER]: Leader election won for term 1
I20250902 21:32:01.557547 4838 raft_consensus.cc:695] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c [term 1 LEADER]: Becoming Leader. State: Replica: 9a6e0058b145476b9c65606690bad44c, State: Running, Role: LEADER
I20250902 21:32:01.557581 4838 consensus_queue.cc:237] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } }
I20250902 21:32:01.558028 4345 catalog_manager.cc:5582] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c reported cstate change: term changed from 0 to 1, leader changed from <none> to 9a6e0058b145476b9c65606690bad44c (127.4.52.194). New cstate: current_term: 1 leader_uuid: "9a6e0058b145476b9c65606690bad44c" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } health_report { overall_health: UNKNOWN } } }
I20250902 21:32:01.583748 4834 consensus_queue.cc:1035] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed [LEADER]: Connected to new peer: Peer: permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20250902 21:32:01.585186 4858 consensus_queue.cc:1035] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed [LEADER]: Connected to new peer: Peer: permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20250902 21:32:01.587299 4307 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6wlXYv/build/release/bin/kudu
/tmp/dist-test-task6wlXYv/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.4.52.196:0
--local_ip_for_outbound_sockets=127.4.52.196
--webserver_interface=127.4.52.196
--webserver_port=0
--tserver_master_addrs=127.4.52.254:35631
--builtin_ntp_servers=127.4.52.212:36103
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--follower_unavailable_considered_failed_sec=2
--enable_log_gc=false with env {}
I20250902 21:32:01.592626 4858 consensus_queue.cc:1035] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed [LEADER]: Connected to new peer: Peer: permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20250902 21:32:01.593883 4827 consensus_queue.cc:1035] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed [LEADER]: Connected to new peer: Peer: permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20250902 21:32:01.600888 4750 raft_consensus.cc:1273] T b3f81968196f41b3a5e3bead654d937a P 06e82aae52a24d6db2e51581ee7ca9ed [term 1 FOLLOWER]: Refusing update from remote peer 4dd1d49578df44b0a0325def79d07969: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 5. (index mismatch)
I20250902 21:32:01.601099 4619 raft_consensus.cc:1273] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c [term 1 FOLLOWER]: Refusing update from remote peer 4dd1d49578df44b0a0325def79d07969: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 5. (index mismatch)
I20250902 21:32:01.601514 4846 consensus_queue.cc:1035] T b3f81968196f41b3a5e3bead654d937a P 4dd1d49578df44b0a0325def79d07969 [LEADER]: Connected to new peer: Peer: permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20250902 21:32:01.601676 4846 consensus_queue.cc:1035] T b3f81968196f41b3a5e3bead654d937a P 4dd1d49578df44b0a0325def79d07969 [LEADER]: Connected to new peer: Peer: permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20250902 21:32:01.601953 4749 raft_consensus.cc:1273] T 71a6015393b644f5abbd15f20c69a5ec P 06e82aae52a24d6db2e51581ee7ca9ed [term 1 FOLLOWER]: Refusing update from remote peer 9a6e0058b145476b9c65606690bad44c: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 5. (index mismatch)
I20250902 21:32:01.602119 4487 raft_consensus.cc:1273] T 71a6015393b644f5abbd15f20c69a5ec P 4dd1d49578df44b0a0325def79d07969 [term 1 FOLLOWER]: Refusing update from remote peer 9a6e0058b145476b9c65606690bad44c: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 4. (index mismatch)
I20250902 21:32:01.602362 4821 consensus_queue.cc:1035] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c [LEADER]: Connected to new peer: Peer: permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20250902 21:32:01.602999 4750 raft_consensus.cc:1273] T 8c955e3402124b248b2740afe7cdff4d P 06e82aae52a24d6db2e51581ee7ca9ed [term 1 FOLLOWER]: Refusing update from remote peer 9a6e0058b145476b9c65606690bad44c: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 5. (index mismatch)
I20250902 21:32:01.602999 4488 raft_consensus.cc:1273] T 8c955e3402124b248b2740afe7cdff4d P 4dd1d49578df44b0a0325def79d07969 [term 1 FOLLOWER]: Refusing update from remote peer 9a6e0058b145476b9c65606690bad44c: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 5. (index mismatch)
I20250902 21:32:01.603080 4821 consensus_queue.cc:1035] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c [LEADER]: Connected to new peer: Peer: permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20250902 21:32:01.603155 4821 consensus_queue.cc:1035] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c [LEADER]: Connected to new peer: Peer: permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20250902 21:32:01.603641 4838 consensus_queue.cc:1035] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c [LEADER]: Connected to new peer: Peer: permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20250902 21:32:01.609272 4876 mvcc.cc:204] Tried to move back new op lower bound from 7196052363680956416 to 7196052363323498496. Current Snapshot: MvccSnapshot[applied={T|T < 7196052363675406336}]
I20250902 21:32:01.609576 4905 mvcc.cc:204] Tried to move back new op lower bound from 7196052363680956416 to 7196052363323498496. Current Snapshot: MvccSnapshot[applied={T|T < 7196052363675406336}]
I20250902 21:32:01.611553 4853 mvcc.cc:204] Tried to move back new op lower bound from 7196052363680956416 to 7196052363323498496. Current Snapshot: MvccSnapshot[applied={T|T < 7196052363677335552}]
W20250902 21:32:01.771329 4864 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250902 21:32:01.771600 4864 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250902 21:32:01.771648 4864 flags.cc:425] Enabled unsafe flag: --enable_log_gc=false
W20250902 21:32:01.771669 4864 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250902 21:32:01.774194 4864 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250902 21:32:01.774295 4864 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.4.52.196
I20250902 21:32:01.776434 4864 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.4.52.212:36103
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--follower_unavailable_considered_failed_sec=2
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.4.52.196:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.4.52.196
--webserver_port=0
--enable_log_gc=false
--tserver_master_addrs=127.4.52.254:35631
--never_fsync=true
--heap_profile_path=/tmp/kudu.4864
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.4.52.196
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 6ab3f249351a4ee5d73eb8bda3585c4dcd85d070
build type RELEASE
built by None at 02 Sep 2025 21:15:19 UTC on 5fd53c4cbb9d
build id 7867
I20250902 21:32:01.776902 4864 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250902 21:32:01.777263 4864 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250902 21:32:01.780822 4939 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:01.781003 4937 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:01.785254 4864 server_base.cc:1047] running on GCE node
W20250902 21:32:01.792973 4936 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:01.793730 4864 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250902 21:32:01.793982 4864 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250902 21:32:01.801488 4864 hybrid_clock.cc:648] HybridClock initialized: now 1756848721801465 us; error 36 us; skew 500 ppm
I20250902 21:32:01.803175 4864 webserver.cc:480] Webserver started at http://127.4.52.196:34911/ using document root <none> and password file <none>
I20250902 21:32:01.803524 4864 fs_manager.cc:362] Metadata directory not provided
I20250902 21:32:01.803637 4864 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250902 21:32:01.803833 4864 server_base.cc:895] This appears to be a new deployment of Kudu; creating new FS layout
I20250902 21:32:01.805583 4864 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-3/data/instance:
uuid: "612f3c479a8a463b9f14f5ddab593fa5"
format_stamp: "Formatted at 2025-09-02 21:32:01 on dist-test-slave-jkp9"
I20250902 21:32:01.806159 4864 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-3/wal/instance:
uuid: "612f3c479a8a463b9f14f5ddab593fa5"
format_stamp: "Formatted at 2025-09-02 21:32:01 on dist-test-slave-jkp9"
I20250902 21:32:01.808051 4864 fs_manager.cc:696] Time spent creating directory manager: real 0.001s user 0.000s sys 0.001s
I20250902 21:32:01.808972 4945 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:01.809144 4864 fs_manager.cc:730] Time spent opening block manager: real 0.001s user 0.000s sys 0.000s
I20250902 21:32:01.809221 4864 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-3/data,/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-3/wal
uuid: "612f3c479a8a463b9f14f5ddab593fa5"
format_stamp: "Formatted at 2025-09-02 21:32:01 on dist-test-slave-jkp9"
I20250902 21:32:01.809274 4864 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250902 21:32:01.828656 4864 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250902 21:32:01.829396 4864 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250902 21:32:01.829661 4864 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250902 21:32:01.829946 4864 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250902 21:32:01.830487 4864 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250902 21:32:01.830587 4864 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:01.830686 4864 ts_tablet_manager.cc:610] Registered 0 tablets
I20250902 21:32:01.830797 4864 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:01.839819 4864 rpc_server.cc:307] RPC server started. Bound to: 127.4.52.196:43591
I20250902 21:32:01.840201 4864 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-3/data/info.pb
I20250902 21:32:01.840596 5058 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.4.52.196:43591 every 8 connection(s)
I20250902 21:32:01.846446 4307 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6wlXYv/build/release/bin/kudu as pid 4864
I20250902 21:32:01.846513 4307 external_mini_cluster.cc:1442] Reading /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-3/wal/instance
I20250902 21:32:01.869751 5059 heartbeater.cc:344] Connected to a master server at 127.4.52.254:35631
I20250902 21:32:01.870026 5059 heartbeater.cc:461] Registering TS with master...
I20250902 21:32:01.870354 5059 heartbeater.cc:507] Master 127.4.52.254:35631 requested a full tablet report, sending...
I20250902 21:32:01.871466 4345 ts_manager.cc:194] Registered new tserver with Master: 612f3c479a8a463b9f14f5ddab593fa5 (127.4.52.196:43591)
I20250902 21:32:01.871960 4345 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.4.52.196:46201
I20250902 21:32:01.977891 4345 ts_manager.cc:295] Set tserver state for 4dd1d49578df44b0a0325def79d07969 to MAINTENANCE_MODE
I20250902 21:32:01.978150 4307 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6wlXYv/build/release/bin/kudu with pid 4406
W20250902 21:32:01.985369 4684 connection.cc:537] server connection from 127.4.52.193:42791 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20250902 21:32:01.985352 4805 connection.cc:537] client connection to 127.4.52.193:41549 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20250902 21:32:01.985489 4684 connection.cc:537] client connection to 127.4.52.193:41549 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20250902 21:32:01.985481 4805 meta_cache.cc:302] tablet b3f81968196f41b3a5e3bead654d937a: replica 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549) has failed: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20250902 21:32:01.985538 4684 proxy.cc:239] Call had error, refreshing address and retrying: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20250902 21:32:01.985515 4553 connection.cc:537] client connection to 127.4.52.193:41549 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20250902 21:32:01.985597 4553 proxy.cc:239] Call had error, refreshing address and retrying: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20250902 21:32:01.985983 4684 consensus_peers.cc:489] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20250902 21:32:01.986040 4684 consensus_peers.cc:489] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20250902 21:32:01.986029 4553 consensus_peers.cc:489] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20250902 21:32:01.986071 4553 consensus_peers.cc:489] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20250902 21:32:01.986094 4553 consensus_peers.cc:489] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20250902 21:32:01.986227 4554 connection.cc:537] server connection from 127.4.52.193:43163 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20250902 21:32:01.990056 4579 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:55172: Illegal state: replica 9a6e0058b145476b9c65606690bad44c is not leader of this config: current role FOLLOWER
W20250902 21:32:01.993243 4709 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:37878: Illegal state: replica 06e82aae52a24d6db2e51581ee7ca9ed is not leader of this config: current role FOLLOWER
W20250902 21:32:02.006848 4579 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:55172: Illegal state: replica 9a6e0058b145476b9c65606690bad44c is not leader of this config: current role FOLLOWER
W20250902 21:32:02.014427 4709 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:37878: Illegal state: replica 06e82aae52a24d6db2e51581ee7ca9ed is not leader of this config: current role FOLLOWER
W20250902 21:32:02.033833 4579 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:55172: Illegal state: replica 9a6e0058b145476b9c65606690bad44c is not leader of this config: current role FOLLOWER
W20250902 21:32:02.045619 4709 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:37878: Illegal state: replica 06e82aae52a24d6db2e51581ee7ca9ed is not leader of this config: current role FOLLOWER
W20250902 21:32:02.067876 4579 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:55172: Illegal state: replica 9a6e0058b145476b9c65606690bad44c is not leader of this config: current role FOLLOWER
W20250902 21:32:02.080345 4709 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:37878: Illegal state: replica 06e82aae52a24d6db2e51581ee7ca9ed is not leader of this config: current role FOLLOWER
W20250902 21:32:02.110814 4579 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:55172: Illegal state: replica 9a6e0058b145476b9c65606690bad44c is not leader of this config: current role FOLLOWER
W20250902 21:32:02.129539 4709 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:37878: Illegal state: replica 06e82aae52a24d6db2e51581ee7ca9ed is not leader of this config: current role FOLLOWER
W20250902 21:32:02.166136 4579 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:55172: Illegal state: replica 9a6e0058b145476b9c65606690bad44c is not leader of this config: current role FOLLOWER
W20250902 21:32:02.186802 4709 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:37878: Illegal state: replica 06e82aae52a24d6db2e51581ee7ca9ed is not leader of this config: current role FOLLOWER
W20250902 21:32:02.230321 4579 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:55172: Illegal state: replica 9a6e0058b145476b9c65606690bad44c is not leader of this config: current role FOLLOWER
W20250902 21:32:02.254211 4709 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:37878: Illegal state: replica 06e82aae52a24d6db2e51581ee7ca9ed is not leader of this config: current role FOLLOWER
I20250902 21:32:02.280284 4858 raft_consensus.cc:491] T b3f81968196f41b3a5e3bead654d937a P 06e82aae52a24d6db2e51581ee7ca9ed [term 1 FOLLOWER]: Starting pre-election (detected failure of leader 4dd1d49578df44b0a0325def79d07969)
I20250902 21:32:02.280376 4858 raft_consensus.cc:513] T b3f81968196f41b3a5e3bead654d937a P 06e82aae52a24d6db2e51581ee7ca9ed [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:02.280524 4858 leader_election.cc:290] T b3f81968196f41b3a5e3bead654d937a P 06e82aae52a24d6db2e51581ee7ca9ed [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549), 9a6e0058b145476b9c65606690bad44c (127.4.52.194:33533)
I20250902 21:32:02.280728 4617 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "b3f81968196f41b3a5e3bead654d937a" candidate_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" candidate_term: 2 candidate_status { last_received { term: 1 index: 129 } } ignore_live_leader: false dest_uuid: "9a6e0058b145476b9c65606690bad44c" is_pre_election: true
W20250902 21:32:02.280972 4684 leader_election.cc:336] T b3f81968196f41b3a5e3bead654d937a P 06e82aae52a24d6db2e51581ee7ca9ed [CANDIDATE]: Term 2 pre-election: RPC error from VoteRequest() call to peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111)
I20250902 21:32:02.281024 4684 leader_election.cc:304] T b3f81968196f41b3a5e3bead654d937a P 06e82aae52a24d6db2e51581ee7ca9ed [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 06e82aae52a24d6db2e51581ee7ca9ed; no voters: 4dd1d49578df44b0a0325def79d07969, 9a6e0058b145476b9c65606690bad44c
I20250902 21:32:02.281087 4858 raft_consensus.cc:2747] T b3f81968196f41b3a5e3bead654d937a P 06e82aae52a24d6db2e51581ee7ca9ed [term 1 FOLLOWER]: Leader pre-election lost for term 2. Reason: could not achieve majority
I20250902 21:32:02.281934 4890 raft_consensus.cc:491] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c [term 1 FOLLOWER]: Starting pre-election (detected failure of leader 4dd1d49578df44b0a0325def79d07969)
I20250902 21:32:02.281985 4890 raft_consensus.cc:513] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:02.282104 4890 leader_election.cc:290] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549), 06e82aae52a24d6db2e51581ee7ca9ed (127.4.52.195:36693)
I20250902 21:32:02.282258 4747 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "b3f81968196f41b3a5e3bead654d937a" candidate_uuid: "9a6e0058b145476b9c65606690bad44c" candidate_term: 2 candidate_status { last_received { term: 1 index: 130 } } ignore_live_leader: false dest_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" is_pre_election: true
I20250902 21:32:02.282334 4747 raft_consensus.cc:2466] T b3f81968196f41b3a5e3bead654d937a P 06e82aae52a24d6db2e51581ee7ca9ed [term 1 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 9a6e0058b145476b9c65606690bad44c in term 1.
I20250902 21:32:02.282480 4552 leader_election.cc:304] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 06e82aae52a24d6db2e51581ee7ca9ed, 9a6e0058b145476b9c65606690bad44c; no voters:
I20250902 21:32:02.282562 4890 raft_consensus.cc:2802] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c [term 1 FOLLOWER]: Leader pre-election won for term 2
W20250902 21:32:02.282570 4553 leader_election.cc:336] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c [CANDIDATE]: Term 2 pre-election: RPC error from VoteRequest() call to peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111)
I20250902 21:32:02.282605 4890 raft_consensus.cc:491] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c [term 1 FOLLOWER]: Starting leader election (detected failure of leader 4dd1d49578df44b0a0325def79d07969)
I20250902 21:32:02.282637 4890 raft_consensus.cc:3058] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c [term 1 FOLLOWER]: Advancing to term 2
I20250902 21:32:02.283283 4890 raft_consensus.cc:513] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c [term 2 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:02.283406 4890 leader_election.cc:290] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c [CANDIDATE]: Term 2 election: Requested vote from peers 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549), 06e82aae52a24d6db2e51581ee7ca9ed (127.4.52.195:36693)
I20250902 21:32:02.283555 4747 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "b3f81968196f41b3a5e3bead654d937a" candidate_uuid: "9a6e0058b145476b9c65606690bad44c" candidate_term: 2 candidate_status { last_received { term: 1 index: 130 } } ignore_live_leader: false dest_uuid: "06e82aae52a24d6db2e51581ee7ca9ed"
I20250902 21:32:02.283615 4747 raft_consensus.cc:3058] T b3f81968196f41b3a5e3bead654d937a P 06e82aae52a24d6db2e51581ee7ca9ed [term 1 FOLLOWER]: Advancing to term 2
W20250902 21:32:02.283804 4553 leader_election.cc:336] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c [CANDIDATE]: Term 2 election: RPC error from VoteRequest() call to peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111)
I20250902 21:32:02.284214 4747 raft_consensus.cc:2466] T b3f81968196f41b3a5e3bead654d937a P 06e82aae52a24d6db2e51581ee7ca9ed [term 2 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 9a6e0058b145476b9c65606690bad44c in term 2.
I20250902 21:32:02.284376 4552 leader_election.cc:304] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c [CANDIDATE]: Term 2 election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 06e82aae52a24d6db2e51581ee7ca9ed, 9a6e0058b145476b9c65606690bad44c; no voters: 4dd1d49578df44b0a0325def79d07969
I20250902 21:32:02.284456 4890 raft_consensus.cc:2802] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c [term 2 FOLLOWER]: Leader election won for term 2
I20250902 21:32:02.284503 4890 raft_consensus.cc:695] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c [term 2 LEADER]: Becoming Leader. State: Replica: 9a6e0058b145476b9c65606690bad44c, State: Running, Role: LEADER
I20250902 21:32:02.284554 4890 consensus_queue.cc:237] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 127, Committed index: 127, Last appended: 1.130, Last appended by leader: 130, Current term: 2, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:02.285007 4345 catalog_manager.cc:5582] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c reported cstate change: term changed from 1 to 2, leader changed from 4dd1d49578df44b0a0325def79d07969 (127.4.52.193) to 9a6e0058b145476b9c65606690bad44c (127.4.52.194). New cstate: current_term: 2 leader_uuid: "9a6e0058b145476b9c65606690bad44c" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } health_report { overall_health: UNKNOWN } } }
I20250902 21:32:02.302016 4747 raft_consensus.cc:1273] T b3f81968196f41b3a5e3bead654d937a P 06e82aae52a24d6db2e51581ee7ca9ed [term 2 FOLLOWER]: Refusing update from remote peer 9a6e0058b145476b9c65606690bad44c: Log matching property violated. Preceding OpId in replica: term: 1 index: 129. Preceding OpId from leader: term: 2 index: 132. (index mismatch)
I20250902 21:32:02.302325 4884 consensus_queue.cc:1035] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c [LEADER]: Connected to new peer: Peer: permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 131, Last known committed idx: 126, Time since last communication: 0.000s
W20250902 21:32:02.302429 4553 consensus_peers.cc:489] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20250902 21:32:02.447872 4684 consensus_peers.cc:489] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20250902 21:32:02.467581 4553 consensus_peers.cc:489] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20250902 21:32:02.470685 4684 consensus_peers.cc:489] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20250902 21:32:02.479868 4553 consensus_peers.cc:489] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20250902 21:32:02.499610 4553 consensus_peers.cc:489] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20250902 21:32:02.787932 4553 consensus_peers.cc:489] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
I20250902 21:32:02.873390 5059 heartbeater.cc:499] Master 127.4.52.254:35631 was elected leader, sending a full tablet report...
W20250902 21:32:02.932278 4684 consensus_peers.cc:489] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20250902 21:32:02.955921 4553 consensus_peers.cc:489] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20250902 21:32:02.967026 4684 consensus_peers.cc:489] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20250902 21:32:02.986497 4553 consensus_peers.cc:489] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20250902 21:32:02.989995 4553 consensus_peers.cc:489] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20250902 21:32:03.260396 4553 consensus_peers.cc:489] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20250902 21:32:03.421486 4684 consensus_peers.cc:489] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20250902 21:32:03.467720 4553 consensus_peers.cc:489] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20250902 21:32:03.475569 4553 consensus_peers.cc:489] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20250902 21:32:03.475558 4684 consensus_peers.cc:489] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20250902 21:32:03.517066 4553 consensus_peers.cc:489] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20250902 21:32:03.773344 4553 consensus_peers.cc:489] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20250902 21:32:03.911339 4684 consensus_peers.cc:489] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
W20250902 21:32:03.958082 4553 consensus_peers.cc:489] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
W20250902 21:32:03.976317 4553 consensus_peers.cc:489] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20250902 21:32:04.005872 4927 consensus_queue.cc:579] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed [LEADER]: Leader has been unable to successfully communicate with peer 4dd1d49578df44b0a0325def79d07969 for more than 2 seconds (2.029s)
I20250902 21:32:04.035409 5079 consensus_queue.cc:579] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c [LEADER]: Leader has been unable to successfully communicate with peer 4dd1d49578df44b0a0325def79d07969 for more than 2 seconds (2.055s)
I20250902 21:32:04.039744 5068 consensus_queue.cc:579] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c [LEADER]: Leader has been unable to successfully communicate with peer 4dd1d49578df44b0a0325def79d07969 for more than 2 seconds (2.062s)
W20250902 21:32:04.041000 4553 consensus_peers.cc:489] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20250902 21:32:04.081801 4922 consensus_queue.cc:579] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed [LEADER]: Leader has been unable to successfully communicate with peer 4dd1d49578df44b0a0325def79d07969 for more than 2 seconds (2.105s)
I20250902 21:32:04.081851 5068 consensus_queue.cc:579] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c [LEADER]: Leader has been unable to successfully communicate with peer 4dd1d49578df44b0a0325def79d07969 for more than 2 seconds (2.104s)
W20250902 21:32:04.086077 4684 consensus_peers.cc:489] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20250902 21:32:04.286947 4884 consensus_queue.cc:579] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c [LEADER]: Leader has been unable to successfully communicate with peer 4dd1d49578df44b0a0325def79d07969 for more than 2 seconds (2.002s)
W20250902 21:32:04.293520 4553 consensus_peers.cc:489] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
W20250902 21:32:04.421842 4553 consensus_peers.cc:489] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20250902 21:32:04.437067 4684 consensus_peers.cc:489] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20250902 21:32:04.445508 4553 consensus_peers.cc:489] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20250902 21:32:04.513294 4553 consensus_peers.cc:489] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20250902 21:32:04.618527 4684 consensus_peers.cc:489] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20250902 21:32:04.780759 4553 consensus_peers.cc:489] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20250902 21:32:04.907996 4553 consensus_peers.cc:489] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
W20250902 21:32:04.969308 4684 consensus_peers.cc:489] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
I20250902 21:32:05.004092 4307 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6wlXYv/build/release/bin/kudu with pid 4315
I20250902 21:32:05.010658 4307 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6wlXYv/build/release/bin/kudu
/tmp/dist-test-task6wlXYv/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.4.52.254:35631
--webserver_interface=127.4.52.254
--webserver_port=44043
--builtin_ntp_servers=127.4.52.212:36103
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.4.52.254:35631 with env {}
W20250902 21:32:05.013376 4553 consensus_peers.cc:489] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
W20250902 21:32:05.043912 4553 consensus_peers.cc:489] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
W20250902 21:32:05.083957 4796 heartbeater.cc:646] Failed to heartbeat to 127.4.52.254:35631 (0 consecutive failures): Network error: Failed to send heartbeat to master: Client connection negotiation failed: client connection to 127.4.52.254:35631: connect: Connection refused (error 111)
I20250902 21:32:05.122166 5079 consensus_queue.cc:786] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c [LEADER]: Peer 4dd1d49578df44b0a0325def79d07969 is lagging by at least 18 ops behind the committed index
I20250902 21:32:05.126384 5068 consensus_queue.cc:786] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c [LEADER]: Peer 4dd1d49578df44b0a0325def79d07969 is lagging by at least 16 ops behind the committed index
I20250902 21:32:05.127691 5090 consensus_queue.cc:786] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c [LEADER]: Peer 4dd1d49578df44b0a0325def79d07969 is lagging by at least 14 ops behind the committed index
I20250902 21:32:05.153520 4827 consensus_queue.cc:786] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed [LEADER]: Peer 4dd1d49578df44b0a0325def79d07969 is lagging by at least 22 ops behind the committed index
I20250902 21:32:05.160786 5088 consensus_queue.cc:786] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed [LEADER]: Peer 4dd1d49578df44b0a0325def79d07969 is lagging by at least 16 ops behind the committed index
W20250902 21:32:05.170980 4684 consensus_peers.cc:489] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
W20250902 21:32:05.194254 5096 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250902 21:32:05.194567 5096 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250902 21:32:05.194614 5096 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250902 21:32:05.196733 5096 flags.cc:425] Enabled experimental flag: --ipki_ca_key_size=768
W20250902 21:32:05.196822 5096 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250902 21:32:05.196873 5096 flags.cc:425] Enabled experimental flag: --tsk_num_rsa_bits=512
W20250902 21:32:05.196950 5096 flags.cc:425] Enabled experimental flag: --rpc_reuseport=true
I20250902 21:32:05.199191 5096 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.4.52.212:36103
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.4.52.254:35631
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.4.52.254:35631
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.4.52.254
--webserver_port=44043
--never_fsync=true
--heap_profile_path=/tmp/kudu.5096
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true
Master server version:
kudu 1.19.0-SNAPSHOT
revision 6ab3f249351a4ee5d73eb8bda3585c4dcd85d070
build type RELEASE
built by None at 02 Sep 2025 21:15:19 UTC on 5fd53c4cbb9d
build id 7867
I20250902 21:32:05.199648 5096 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250902 21:32:05.200002 5096 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250902 21:32:05.205699 5102 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:05.206158 5101 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:05.206177 5104 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:05.207206 5096 server_base.cc:1047] running on GCE node
I20250902 21:32:05.207423 5096 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250902 21:32:05.207732 5096 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250902 21:32:05.211187 5096 hybrid_clock.cc:648] HybridClock initialized: now 1756848725210362 us; error 838 us; skew 500 ppm
I20250902 21:32:05.213119 5096 webserver.cc:480] Webserver started at http://127.4.52.254:44043/ using document root <none> and password file <none>
I20250902 21:32:05.213490 5096 fs_manager.cc:362] Metadata directory not provided
I20250902 21:32:05.213622 5096 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250902 21:32:05.216650 5096 fs_manager.cc:714] Time spent opening directory manager: real 0.001s user 0.001s sys 0.000s
I20250902 21:32:05.217196 5110 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:05.217350 5096 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:05.217412 5096 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/master-0/data,/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/master-0/wal
uuid: "3323a6faa7434a218c3dae2aa20a5e9b"
format_stamp: "Formatted at 2025-09-02 21:32:01 on dist-test-slave-jkp9"
I20250902 21:32:05.217660 5096 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250902 21:32:05.218410 5086 consensus_queue.cc:786] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c [LEADER]: Peer 4dd1d49578df44b0a0325def79d07969 is lagging by at least 36 ops behind the committed index
I20250902 21:32:05.243059 5096 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250902 21:32:05.243600 5096 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250902 21:32:05.243806 5096 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250902 21:32:05.248696 5096 rpc_server.cc:307] RPC server started. Bound to: 127.4.52.254:35631
I20250902 21:32:05.248720 5162 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.4.52.254:35631 every 8 connection(s)
I20250902 21:32:05.249081 5096 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/master-0/data/info.pb
I20250902 21:32:05.250135 5163 sys_catalog.cc:263] Verifying existing consensus state
I20250902 21:32:05.250686 5163 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b: Bootstrap starting.
I20250902 21:32:05.258651 4307 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6wlXYv/build/release/bin/kudu as pid 5096
I20250902 21:32:05.261607 5163 log.cc:826] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b: Log is configured to *not* fsync() on all Append() calls
I20250902 21:32:05.264240 5163 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b: Bootstrap replayed 1/1 log segments. Stats: ops{read=14 overwritten=0 applied=14 ignored=0} inserts{seen=11 ignored=0} mutations{seen=13 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250902 21:32:05.264642 5163 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b: Bootstrap complete.
I20250902 21:32:05.266685 5163 raft_consensus.cc:357] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b [term 1 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "3323a6faa7434a218c3dae2aa20a5e9b" member_type: VOTER last_known_addr { host: "127.4.52.254" port: 35631 } }
I20250902 21:32:05.266965 5163 raft_consensus.cc:738] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: 3323a6faa7434a218c3dae2aa20a5e9b, State: Initialized, Role: FOLLOWER
I20250902 21:32:05.267264 5163 consensus_queue.cc:260] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 14, Last appended: 1.14, Last appended by leader: 14, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "3323a6faa7434a218c3dae2aa20a5e9b" member_type: VOTER last_known_addr { host: "127.4.52.254" port: 35631 } }
I20250902 21:32:05.267341 5163 raft_consensus.cc:397] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b [term 1 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250902 21:32:05.267375 5163 raft_consensus.cc:491] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b [term 1 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250902 21:32:05.267416 5163 raft_consensus.cc:3058] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b [term 1 FOLLOWER]: Advancing to term 2
I20250902 21:32:05.268483 5163 raft_consensus.cc:513] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b [term 2 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "3323a6faa7434a218c3dae2aa20a5e9b" member_type: VOTER last_known_addr { host: "127.4.52.254" port: 35631 } }
I20250902 21:32:05.268612 5163 leader_election.cc:304] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b [CANDIDATE]: Term 2 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 3323a6faa7434a218c3dae2aa20a5e9b; no voters:
I20250902 21:32:05.268784 5163 leader_election.cc:290] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b [CANDIDATE]: Term 2 election: Requested vote from peers
I20250902 21:32:05.268939 5167 raft_consensus.cc:2802] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b [term 2 FOLLOWER]: Leader election won for term 2
I20250902 21:32:05.268996 5163 sys_catalog.cc:564] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b [sys.catalog]: configured and running, proceeding with master startup.
I20250902 21:32:05.269083 5167 raft_consensus.cc:695] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b [term 2 LEADER]: Becoming Leader. State: Replica: 3323a6faa7434a218c3dae2aa20a5e9b, State: Running, Role: LEADER
I20250902 21:32:05.269174 5167 consensus_queue.cc:237] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 14, Committed index: 14, Last appended: 1.14, Last appended by leader: 14, Current term: 2, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "3323a6faa7434a218c3dae2aa20a5e9b" member_type: VOTER last_known_addr { host: "127.4.52.254" port: 35631 } }
I20250902 21:32:05.269410 5167 sys_catalog.cc:455] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 2 leader_uuid: "3323a6faa7434a218c3dae2aa20a5e9b" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "3323a6faa7434a218c3dae2aa20a5e9b" member_type: VOTER last_known_addr { host: "127.4.52.254" port: 35631 } } }
I20250902 21:32:05.269480 5167 sys_catalog.cc:458] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b [sys.catalog]: This master's current role is: LEADER
I20250902 21:32:05.269591 5167 sys_catalog.cc:455] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b [sys.catalog]: SysCatalogTable state changed. Reason: New leader 3323a6faa7434a218c3dae2aa20a5e9b. Latest consensus state: current_term: 2 leader_uuid: "3323a6faa7434a218c3dae2aa20a5e9b" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "3323a6faa7434a218c3dae2aa20a5e9b" member_type: VOTER last_known_addr { host: "127.4.52.254" port: 35631 } } }
I20250902 21:32:05.269646 5167 sys_catalog.cc:458] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b [sys.catalog]: This master's current role is: LEADER
I20250902 21:32:05.271811 5181 catalog_manager.cc:1261] Loaded cluster ID: 8d38126b93b947b29dc440ea611353c0
I20250902 21:32:05.271895 5181 catalog_manager.cc:1554] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b: loading cluster ID for follower catalog manager: success
I20250902 21:32:05.273106 5181 catalog_manager.cc:1576] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b: acquiring CA information for follower catalog manager: success
I20250902 21:32:05.273483 5181 catalog_manager.cc:1604] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b: importing token verification keys for follower catalog manager: success; most recent TSK sequence number 0
I20250902 21:32:05.273859 5170 catalog_manager.cc:1477] Loading table and tablet metadata into memory...
I20250902 21:32:05.274149 5170 catalog_manager.cc:671] Loaded metadata for table test-workload [id=fe8c3f057d5241579c8454bc9b3cfdd8]
I20250902 21:32:05.274610 5170 tablet_loader.cc:96] loaded metadata for tablet 2f100030f5ec42f085bf83f379ebb850 (table test-workload [id=fe8c3f057d5241579c8454bc9b3cfdd8])
I20250902 21:32:05.274737 5170 tablet_loader.cc:96] loaded metadata for tablet 32d5814bb7e34ba8a525018a3d441fc7 (table test-workload [id=fe8c3f057d5241579c8454bc9b3cfdd8])
I20250902 21:32:05.274870 5170 tablet_loader.cc:96] loaded metadata for tablet 71a6015393b644f5abbd15f20c69a5ec (table test-workload [id=fe8c3f057d5241579c8454bc9b3cfdd8])
I20250902 21:32:05.274950 5170 tablet_loader.cc:96] loaded metadata for tablet 85463c5915304980b0a2aba153ed3da0 (table test-workload [id=fe8c3f057d5241579c8454bc9b3cfdd8])
I20250902 21:32:05.275028 5170 tablet_loader.cc:96] loaded metadata for tablet 8c955e3402124b248b2740afe7cdff4d (table test-workload [id=fe8c3f057d5241579c8454bc9b3cfdd8])
I20250902 21:32:05.275108 5170 tablet_loader.cc:96] loaded metadata for tablet b3f81968196f41b3a5e3bead654d937a (table test-workload [id=fe8c3f057d5241579c8454bc9b3cfdd8])
I20250902 21:32:05.275179 5170 catalog_manager.cc:1486] Initializing Kudu cluster ID...
I20250902 21:32:05.275271 5170 catalog_manager.cc:1261] Loaded cluster ID: 8d38126b93b947b29dc440ea611353c0
I20250902 21:32:05.275306 5170 catalog_manager.cc:1497] Initializing Kudu internal certificate authority...
I20250902 21:32:05.276050 5170 catalog_manager.cc:1506] Loading token signing keys...
I20250902 21:32:05.276194 5170 catalog_manager.cc:5966] T 00000000000000000000000000000000 P 3323a6faa7434a218c3dae2aa20a5e9b: Loaded TSK: 0
I20250902 21:32:05.276407 5170 catalog_manager.cc:1516] Initializing in-progress tserver states...
I20250902 21:32:05.322162 5127 master_service.cc:432] Got heartbeat from unknown tserver (permanent_uuid: "9a6e0058b145476b9c65606690bad44c" instance_seqno: 1756848721303107) as {username='slave'} at 127.4.52.194:36911; Asking this server to re-register.
I20250902 21:32:05.322501 4665 heartbeater.cc:461] Registering TS with master...
I20250902 21:32:05.322594 4665 heartbeater.cc:507] Master 127.4.52.254:35631 requested a full tablet report, sending...
I20250902 21:32:05.322986 5127 ts_manager.cc:194] Registered new tserver with Master: 9a6e0058b145476b9c65606690bad44c (127.4.52.194:33533)
W20250902 21:32:05.325479 4553 consensus_peers.cc:489] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
W20250902 21:32:05.454684 4553 consensus_peers.cc:489] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 36: this message will repeat every 5th retry.
W20250902 21:32:05.459508 4684 consensus_peers.cc:489] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 36: this message will repeat every 5th retry.
W20250902 21:32:05.515705 4553 consensus_peers.cc:489] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 36: this message will repeat every 5th retry.
W20250902 21:32:05.535245 4553 consensus_peers.cc:489] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 36: this message will repeat every 5th retry.
W20250902 21:32:05.683699 4684 consensus_peers.cc:489] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 36: this message will repeat every 5th retry.
W20250902 21:32:05.853647 4553 consensus_peers.cc:489] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 36: this message will repeat every 5th retry.
I20250902 21:32:05.897116 5127 master_service.cc:432] Got heartbeat from unknown tserver (permanent_uuid: "612f3c479a8a463b9f14f5ddab593fa5" instance_seqno: 1756848721836767) as {username='slave'} at 127.4.52.196:44895; Asking this server to re-register.
I20250902 21:32:05.897442 5059 heartbeater.cc:461] Registering TS with master...
I20250902 21:32:05.897528 5059 heartbeater.cc:507] Master 127.4.52.254:35631 requested a full tablet report, sending...
I20250902 21:32:05.897784 5127 ts_manager.cc:194] Registered new tserver with Master: 612f3c479a8a463b9f14f5ddab593fa5 (127.4.52.196:43591)
W20250902 21:32:05.980795 4553 consensus_peers.cc:489] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 41: this message will repeat every 5th retry.
W20250902 21:32:05.984006 4684 consensus_peers.cc:489] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 41: this message will repeat every 5th retry.
W20250902 21:32:06.015702 4553 consensus_peers.cc:489] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 41: this message will repeat every 5th retry.
W20250902 21:32:06.064275 4553 consensus_peers.cc:489] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 41: this message will repeat every 5th retry.
I20250902 21:32:06.087376 4796 heartbeater.cc:344] Connected to a master server at 127.4.52.254:35631
I20250902 21:32:06.087774 5127 master_service.cc:432] Got heartbeat from unknown tserver (permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" instance_seqno: 1756848721413689) as {username='slave'} at 127.4.52.195:36409; Asking this server to re-register.
I20250902 21:32:06.088059 4796 heartbeater.cc:461] Registering TS with master...
I20250902 21:32:06.088129 4796 heartbeater.cc:507] Master 127.4.52.254:35631 requested a full tablet report, sending...
I20250902 21:32:06.088701 5127 ts_manager.cc:194] Registered new tserver with Master: 06e82aae52a24d6db2e51581ee7ca9ed (127.4.52.195:36693)
W20250902 21:32:06.171360 4684 consensus_peers.cc:489] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 41: this message will repeat every 5th retry.
W20250902 21:32:06.373061 4553 consensus_peers.cc:489] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 41: this message will repeat every 5th retry.
W20250902 21:32:06.436308 4553 consensus_peers.cc:489] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 46: this message will repeat every 5th retry.
W20250902 21:32:06.467173 4553 consensus_peers.cc:489] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 46: this message will repeat every 5th retry.
W20250902 21:32:06.482645 4684 consensus_peers.cc:489] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 46: this message will repeat every 5th retry.
W20250902 21:32:06.526959 4553 consensus_peers.cc:489] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 46: this message will repeat every 5th retry.
W20250902 21:32:06.718102 4684 consensus_peers.cc:489] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 46: this message will repeat every 5th retry.
W20250902 21:32:06.887337 4553 consensus_peers.cc:489] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 46: this message will repeat every 5th retry.
W20250902 21:32:06.994005 4553 proxy.cc:239] Call had error, refreshing address and retrying: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111) [suppressed 197 similar messages]
W20250902 21:32:06.995903 4553 consensus_peers.cc:489] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 51: this message will repeat every 5th retry.
W20250902 21:32:06.999256 4553 consensus_peers.cc:489] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 51: this message will repeat every 5th retry.
W20250902 21:32:07.002584 4684 proxy.cc:239] Call had error, refreshing address and retrying: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111) [suppressed 98 similar messages]
W20250902 21:32:07.003080 4684 consensus_peers.cc:489] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 51: this message will repeat every 5th retry.
W20250902 21:32:07.026134 4553 consensus_peers.cc:489] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 51: this message will repeat every 5th retry.
W20250902 21:32:07.187446 4684 consensus_peers.cc:489] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 51: this message will repeat every 5th retry.
W20250902 21:32:07.439067 4553 consensus_peers.cc:489] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 51: this message will repeat every 5th retry.
W20250902 21:32:07.500963 4553 consensus_peers.cc:489] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 56: this message will repeat every 5th retry.
W20250902 21:32:07.521278 4553 consensus_peers.cc:489] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 56: this message will repeat every 5th retry.
W20250902 21:32:07.521837 4553 consensus_peers.cc:489] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 56: this message will repeat every 5th retry.
W20250902 21:32:07.536854 4684 consensus_peers.cc:489] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 56: this message will repeat every 5th retry.
W20250902 21:32:07.717496 4684 consensus_peers.cc:489] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 56: this message will repeat every 5th retry.
W20250902 21:32:07.919097 4553 consensus_peers.cc:489] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 56: this message will repeat every 5th retry.
W20250902 21:32:08.056411 4553 consensus_peers.cc:489] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 61: this message will repeat every 5th retry.
W20250902 21:32:08.071769 4553 consensus_peers.cc:489] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 61: this message will repeat every 5th retry.
W20250902 21:32:08.074826 4684 consensus_peers.cc:489] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 61: this message will repeat every 5th retry.
W20250902 21:32:08.084877 4553 consensus_peers.cc:489] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 61: this message will repeat every 5th retry.
I20250902 21:32:08.133755 5068 consensus_queue.cc:786] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c [LEADER]: Peer 4dd1d49578df44b0a0325def79d07969 is lagging by at least 1445 ops behind the committed index [suppressed 29 similar messages]
I20250902 21:32:08.177474 5195 consensus_queue.cc:786] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c [LEADER]: Peer 4dd1d49578df44b0a0325def79d07969 is lagging by at least 1477 ops behind the committed index [suppressed 29 similar messages]
I20250902 21:32:08.199457 5071 consensus_queue.cc:786] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c [LEADER]: Peer 4dd1d49578df44b0a0325def79d07969 is lagging by at least 1484 ops behind the committed index [suppressed 28 similar messages]
I20250902 21:32:08.225235 5195 consensus_queue.cc:786] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c [LEADER]: Peer 4dd1d49578df44b0a0325def79d07969 is lagging by at least 1480 ops behind the committed index [suppressed 28 similar messages]
I20250902 21:32:08.252339 5187 consensus_queue.cc:786] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed [LEADER]: Peer 4dd1d49578df44b0a0325def79d07969 is lagging by at least 1498 ops behind the committed index [suppressed 29 similar messages]
I20250902 21:32:08.256500 5186 consensus_queue.cc:786] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed [LEADER]: Peer 4dd1d49578df44b0a0325def79d07969 is lagging by at least 1511 ops behind the committed index [suppressed 29 similar messages]
W20250902 21:32:08.263020 4684 consensus_peers.cc:489] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 61: this message will repeat every 5th retry.
I20250902 21:32:08.264501 4307 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6wlXYv/build/release/bin/kudu
/tmp/dist-test-task6wlXYv/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.4.52.193:41549
--local_ip_for_outbound_sockets=127.4.52.193
--tserver_master_addrs=127.4.52.254:35631
--webserver_port=40731
--webserver_interface=127.4.52.193
--builtin_ntp_servers=127.4.52.212:36103
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--follower_unavailable_considered_failed_sec=2
--enable_log_gc=false with env {}
W20250902 21:32:08.413784 5204 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250902 21:32:08.414008 5204 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250902 21:32:08.414033 5204 flags.cc:425] Enabled unsafe flag: --enable_log_gc=false
W20250902 21:32:08.414059 5204 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250902 21:32:08.415472 4553 consensus_peers.cc:489] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 61: this message will repeat every 5th retry.
W20250902 21:32:08.415925 5204 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250902 21:32:08.415982 5204 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.4.52.193
I20250902 21:32:08.428815 5204 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.4.52.212:36103
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--follower_unavailable_considered_failed_sec=2
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.4.52.193:41549
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.4.52.193
--webserver_port=40731
--enable_log_gc=false
--tserver_master_addrs=127.4.52.254:35631
--never_fsync=true
--heap_profile_path=/tmp/kudu.5204
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.4.52.193
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 6ab3f249351a4ee5d73eb8bda3585c4dcd85d070
build type RELEASE
built by None at 02 Sep 2025 21:15:19 UTC on 5fd53c4cbb9d
build id 7867
I20250902 21:32:08.429397 5204 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250902 21:32:08.430401 5204 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250902 21:32:08.433502 5212 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:08.433892 5210 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:08.434541 5209 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:08.435242 5204 server_base.cc:1047] running on GCE node
I20250902 21:32:08.435638 5204 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250902 21:32:08.435899 5204 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250902 21:32:08.437098 5204 hybrid_clock.cc:648] HybridClock initialized: now 1756848728437075 us; error 34 us; skew 500 ppm
I20250902 21:32:08.438446 5204 webserver.cc:480] Webserver started at http://127.4.52.193:40731/ using document root <none> and password file <none>
I20250902 21:32:08.438805 5204 fs_manager.cc:362] Metadata directory not provided
I20250902 21:32:08.438879 5204 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250902 21:32:08.440538 5204 fs_manager.cc:714] Time spent opening directory manager: real 0.001s user 0.000s sys 0.001s
I20250902 21:32:08.442307 5218 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:08.442461 5204 fs_manager.cc:730] Time spent opening block manager: real 0.001s user 0.000s sys 0.000s
I20250902 21:32:08.442529 5204 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-0/data,/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-0/wal
uuid: "4dd1d49578df44b0a0325def79d07969"
format_stamp: "Formatted at 2025-09-02 21:32:01 on dist-test-slave-jkp9"
I20250902 21:32:08.442859 5204 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250902 21:32:08.463762 5204 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250902 21:32:08.464282 5204 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250902 21:32:08.464654 5204 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250902 21:32:08.464923 5204 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250902 21:32:08.474170 5225 ts_tablet_manager.cc:536] Loading tablet metadata (0/6 complete)
I20250902 21:32:08.481583 5204 ts_tablet_manager.cc:579] Loaded tablet metadata (6 total tablets, 6 live tablets)
I20250902 21:32:08.481644 5204 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.016s user 0.000s sys 0.000s
I20250902 21:32:08.481688 5204 ts_tablet_manager.cc:594] Registering tablets (0/6 complete)
I20250902 21:32:08.484035 5204 ts_tablet_manager.cc:610] Registered 6 tablets
I20250902 21:32:08.484073 5204 ts_tablet_manager.cc:589] Time spent register tablets: real 0.002s user 0.000s sys 0.002s
I20250902 21:32:08.484179 5225 tablet_bootstrap.cc:492] T 85463c5915304980b0a2aba153ed3da0 P 4dd1d49578df44b0a0325def79d07969: Bootstrap starting.
I20250902 21:32:08.493636 5204 rpc_server.cc:307] RPC server started. Bound to: 127.4.52.193:41549
I20250902 21:32:08.493722 5332 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.4.52.193:41549 every 8 connection(s)
I20250902 21:32:08.494081 5204 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1756848720918944-4307-0/minicluster-data/ts-0/data/info.pb
I20250902 21:32:08.494879 4307 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6wlXYv/build/release/bin/kudu as pid 5204
I20250902 21:32:08.511400 5333 heartbeater.cc:344] Connected to a master server at 127.4.52.254:35631
I20250902 21:32:08.511533 5333 heartbeater.cc:461] Registering TS with master...
I20250902 21:32:08.511765 5333 heartbeater.cc:507] Master 127.4.52.254:35631 requested a full tablet report, sending...
I20250902 21:32:08.512946 5127 ts_manager.cc:194] Registered new tserver with Master: 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549)
I20250902 21:32:08.513114 5225 log.cc:826] T 85463c5915304980b0a2aba153ed3da0 P 4dd1d49578df44b0a0325def79d07969: Log is configured to *not* fsync() on all Append() calls
I20250902 21:32:08.513888 5127 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.4.52.193:50577
I20250902 21:32:08.541253 5225 tablet_bootstrap.cc:492] T 85463c5915304980b0a2aba153ed3da0 P 4dd1d49578df44b0a0325def79d07969: Bootstrap replayed 1/1 log segments. Stats: ops{read=127 overwritten=0 applied=126 ignored=0} inserts{seen=992 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20250902 21:32:08.541798 5225 tablet_bootstrap.cc:492] T 85463c5915304980b0a2aba153ed3da0 P 4dd1d49578df44b0a0325def79d07969: Bootstrap complete.
I20250902 21:32:08.543067 5225 ts_tablet_manager.cc:1397] T 85463c5915304980b0a2aba153ed3da0 P 4dd1d49578df44b0a0325def79d07969: Time spent bootstrapping tablet: real 0.059s user 0.018s sys 0.007s
I20250902 21:32:08.544368 5225 raft_consensus.cc:357] T 85463c5915304980b0a2aba153ed3da0 P 4dd1d49578df44b0a0325def79d07969 [term 1 FOLLOWER]: Replica starting. Triggering 1 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:08.545852 5225 raft_consensus.cc:738] T 85463c5915304980b0a2aba153ed3da0 P 4dd1d49578df44b0a0325def79d07969 [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: 4dd1d49578df44b0a0325def79d07969, State: Initialized, Role: FOLLOWER
I20250902 21:32:08.546039 5225 consensus_queue.cc:260] T 85463c5915304980b0a2aba153ed3da0 P 4dd1d49578df44b0a0325def79d07969 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 126, Last appended: 1.127, Last appended by leader: 127, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:08.546938 5225 ts_tablet_manager.cc:1428] T 85463c5915304980b0a2aba153ed3da0 P 4dd1d49578df44b0a0325def79d07969: Time spent starting tablet: real 0.004s user 0.001s sys 0.002s
I20250902 21:32:08.547227 5333 heartbeater.cc:499] Master 127.4.52.254:35631 was elected leader, sending a full tablet report...
I20250902 21:32:08.548224 5225 tablet_bootstrap.cc:492] T b3f81968196f41b3a5e3bead654d937a P 4dd1d49578df44b0a0325def79d07969: Bootstrap starting.
W20250902 21:32:08.566826 4684 consensus_peers.cc:489] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: INITIALIZED. This is attempt 66: this message will repeat every 5th retry.
W20250902 21:32:08.572775 4553 consensus_peers.cc:489] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: INITIALIZED. This is attempt 66: this message will repeat every 5th retry.
I20250902 21:32:08.594447 5225 tablet_bootstrap.cc:492] T b3f81968196f41b3a5e3bead654d937a P 4dd1d49578df44b0a0325def79d07969: Bootstrap replayed 1/1 log segments. Stats: ops{read=130 overwritten=0 applied=127 ignored=0} inserts{seen=1052 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20250902 21:32:08.594858 5225 tablet_bootstrap.cc:492] T b3f81968196f41b3a5e3bead654d937a P 4dd1d49578df44b0a0325def79d07969: Bootstrap complete.
I20250902 21:32:08.595978 5225 ts_tablet_manager.cc:1397] T b3f81968196f41b3a5e3bead654d937a P 4dd1d49578df44b0a0325def79d07969: Time spent bootstrapping tablet: real 0.048s user 0.012s sys 0.010s
I20250902 21:32:08.596184 5225 raft_consensus.cc:357] T b3f81968196f41b3a5e3bead654d937a P 4dd1d49578df44b0a0325def79d07969 [term 1 FOLLOWER]: Replica starting. Triggering 3 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:08.596374 5225 raft_consensus.cc:738] T b3f81968196f41b3a5e3bead654d937a P 4dd1d49578df44b0a0325def79d07969 [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: 4dd1d49578df44b0a0325def79d07969, State: Initialized, Role: FOLLOWER
I20250902 21:32:08.596463 5225 consensus_queue.cc:260] T b3f81968196f41b3a5e3bead654d937a P 4dd1d49578df44b0a0325def79d07969 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 127, Last appended: 1.130, Last appended by leader: 130, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:08.596618 5225 ts_tablet_manager.cc:1428] T b3f81968196f41b3a5e3bead654d937a P 4dd1d49578df44b0a0325def79d07969: Time spent starting tablet: real 0.001s user 0.000s sys 0.000s
I20250902 21:32:08.596715 5225 tablet_bootstrap.cc:492] T 71a6015393b644f5abbd15f20c69a5ec P 4dd1d49578df44b0a0325def79d07969: Bootstrap starting.
W20250902 21:32:08.606360 4553 consensus_peers.cc:489] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: BOOTSTRAPPING. This is attempt 66: this message will repeat every 5th retry.
I20250902 21:32:08.632040 5225 tablet_bootstrap.cc:492] T 71a6015393b644f5abbd15f20c69a5ec P 4dd1d49578df44b0a0325def79d07969: Bootstrap replayed 1/1 log segments. Stats: ops{read=127 overwritten=0 applied=126 ignored=0} inserts{seen=1059 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20250902 21:32:08.632453 5225 tablet_bootstrap.cc:492] T 71a6015393b644f5abbd15f20c69a5ec P 4dd1d49578df44b0a0325def79d07969: Bootstrap complete.
I20250902 21:32:08.633471 5225 ts_tablet_manager.cc:1397] T 71a6015393b644f5abbd15f20c69a5ec P 4dd1d49578df44b0a0325def79d07969: Time spent bootstrapping tablet: real 0.037s user 0.008s sys 0.015s
I20250902 21:32:08.633671 5225 raft_consensus.cc:357] T 71a6015393b644f5abbd15f20c69a5ec P 4dd1d49578df44b0a0325def79d07969 [term 1 FOLLOWER]: Replica starting. Triggering 1 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } }
I20250902 21:32:08.633939 5225 raft_consensus.cc:738] T 71a6015393b644f5abbd15f20c69a5ec P 4dd1d49578df44b0a0325def79d07969 [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: 4dd1d49578df44b0a0325def79d07969, State: Initialized, Role: FOLLOWER
I20250902 21:32:08.634048 5225 consensus_queue.cc:260] T 71a6015393b644f5abbd15f20c69a5ec P 4dd1d49578df44b0a0325def79d07969 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 126, Last appended: 1.127, Last appended by leader: 127, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } }
I20250902 21:32:08.634217 5225 ts_tablet_manager.cc:1428] T 71a6015393b644f5abbd15f20c69a5ec P 4dd1d49578df44b0a0325def79d07969: Time spent starting tablet: real 0.001s user 0.000s sys 0.000s
I20250902 21:32:08.634310 5225 tablet_bootstrap.cc:492] T 2f100030f5ec42f085bf83f379ebb850 P 4dd1d49578df44b0a0325def79d07969: Bootstrap starting.
I20250902 21:32:08.666834 5225 tablet_bootstrap.cc:492] T 2f100030f5ec42f085bf83f379ebb850 P 4dd1d49578df44b0a0325def79d07969: Bootstrap replayed 1/1 log segments. Stats: ops{read=130 overwritten=0 applied=130 ignored=0} inserts{seen=1070 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250902 21:32:08.667176 5225 tablet_bootstrap.cc:492] T 2f100030f5ec42f085bf83f379ebb850 P 4dd1d49578df44b0a0325def79d07969: Bootstrap complete.
I20250902 21:32:08.668236 5225 ts_tablet_manager.cc:1397] T 2f100030f5ec42f085bf83f379ebb850 P 4dd1d49578df44b0a0325def79d07969: Time spent bootstrapping tablet: real 0.034s user 0.013s sys 0.008s
I20250902 21:32:08.668452 5225 raft_consensus.cc:357] T 2f100030f5ec42f085bf83f379ebb850 P 4dd1d49578df44b0a0325def79d07969 [term 1 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:08.668591 5225 raft_consensus.cc:738] T 2f100030f5ec42f085bf83f379ebb850 P 4dd1d49578df44b0a0325def79d07969 [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: 4dd1d49578df44b0a0325def79d07969, State: Initialized, Role: FOLLOWER
I20250902 21:32:08.668706 5225 consensus_queue.cc:260] T 2f100030f5ec42f085bf83f379ebb850 P 4dd1d49578df44b0a0325def79d07969 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 130, Last appended: 1.130, Last appended by leader: 130, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:08.668895 5225 ts_tablet_manager.cc:1428] T 2f100030f5ec42f085bf83f379ebb850 P 4dd1d49578df44b0a0325def79d07969: Time spent starting tablet: real 0.001s user 0.000s sys 0.000s
I20250902 21:32:08.668978 5225 tablet_bootstrap.cc:492] T 32d5814bb7e34ba8a525018a3d441fc7 P 4dd1d49578df44b0a0325def79d07969: Bootstrap starting.
I20250902 21:32:08.711340 5225 tablet_bootstrap.cc:492] T 32d5814bb7e34ba8a525018a3d441fc7 P 4dd1d49578df44b0a0325def79d07969: Bootstrap replayed 1/1 log segments. Stats: ops{read=130 overwritten=0 applied=130 ignored=0} inserts{seen=1078 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250902 21:32:08.722395 5225 tablet_bootstrap.cc:492] T 32d5814bb7e34ba8a525018a3d441fc7 P 4dd1d49578df44b0a0325def79d07969: Bootstrap complete.
I20250902 21:32:08.731588 5225 ts_tablet_manager.cc:1397] T 32d5814bb7e34ba8a525018a3d441fc7 P 4dd1d49578df44b0a0325def79d07969: Time spent bootstrapping tablet: real 0.063s user 0.011s sys 0.011s
I20250902 21:32:08.731870 5225 raft_consensus.cc:357] T 32d5814bb7e34ba8a525018a3d441fc7 P 4dd1d49578df44b0a0325def79d07969 [term 1 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:08.736865 5225 raft_consensus.cc:738] T 32d5814bb7e34ba8a525018a3d441fc7 P 4dd1d49578df44b0a0325def79d07969 [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: 4dd1d49578df44b0a0325def79d07969, State: Initialized, Role: FOLLOWER
I20250902 21:32:08.737025 5225 consensus_queue.cc:260] T 32d5814bb7e34ba8a525018a3d441fc7 P 4dd1d49578df44b0a0325def79d07969 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 130, Last appended: 1.130, Last appended by leader: 130, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:08.737278 5225 ts_tablet_manager.cc:1428] T 32d5814bb7e34ba8a525018a3d441fc7 P 4dd1d49578df44b0a0325def79d07969: Time spent starting tablet: real 0.006s user 0.000s sys 0.000s
I20250902 21:32:08.737412 5225 tablet_bootstrap.cc:492] T 8c955e3402124b248b2740afe7cdff4d P 4dd1d49578df44b0a0325def79d07969: Bootstrap starting.
I20250902 21:32:08.750381 5286 raft_consensus.cc:3058] T b3f81968196f41b3a5e3bead654d937a P 4dd1d49578df44b0a0325def79d07969 [term 1 FOLLOWER]: Advancing to term 2
I20250902 21:32:08.877219 5225 tablet_bootstrap.cc:492] T 8c955e3402124b248b2740afe7cdff4d P 4dd1d49578df44b0a0325def79d07969: Bootstrap replayed 1/1 log segments. Stats: ops{read=127 overwritten=0 applied=126 ignored=0} inserts{seen=1068 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20250902 21:32:08.877650 5225 tablet_bootstrap.cc:492] T 8c955e3402124b248b2740afe7cdff4d P 4dd1d49578df44b0a0325def79d07969: Bootstrap complete.
I20250902 21:32:08.883829 5225 ts_tablet_manager.cc:1397] T 8c955e3402124b248b2740afe7cdff4d P 4dd1d49578df44b0a0325def79d07969: Time spent bootstrapping tablet: real 0.146s user 0.010s sys 0.013s
I20250902 21:32:08.890338 5225 raft_consensus.cc:357] T 8c955e3402124b248b2740afe7cdff4d P 4dd1d49578df44b0a0325def79d07969 [term 1 FOLLOWER]: Replica starting. Triggering 1 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:08.890671 5225 raft_consensus.cc:738] T 8c955e3402124b248b2740afe7cdff4d P 4dd1d49578df44b0a0325def79d07969 [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: 4dd1d49578df44b0a0325def79d07969, State: Initialized, Role: FOLLOWER
I20250902 21:32:08.890825 5225 consensus_queue.cc:260] T 8c955e3402124b248b2740afe7cdff4d P 4dd1d49578df44b0a0325def79d07969 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 126, Last appended: 1.127, Last appended by leader: 127, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:08.891062 5225 ts_tablet_manager.cc:1428] T 8c955e3402124b248b2740afe7cdff4d P 4dd1d49578df44b0a0325def79d07969: Time spent starting tablet: real 0.007s user 0.000s sys 0.000s
I20250902 21:32:09.102408 5356 mvcc.cc:204] Tried to move back new op lower bound from 7196052378089340928 to 7196052366477836288. Current Snapshot: MvccSnapshot[applied={T|T < 7196052366547836928}]
W20250902 21:32:09.191917 5365 log.cc:927] Time spent T 8c955e3402124b248b2740afe7cdff4d P 4dd1d49578df44b0a0325def79d07969: Append to log took a long time: real 0.076s user 0.002s sys 0.000s
I20250902 21:32:09.224823 5368 raft_consensus.cc:491] T 32d5814bb7e34ba8a525018a3d441fc7 P 4dd1d49578df44b0a0325def79d07969 [term 1 FOLLOWER]: Starting pre-election (detected failure of leader 06e82aae52a24d6db2e51581ee7ca9ed)
I20250902 21:32:09.229075 5368 raft_consensus.cc:513] T 32d5814bb7e34ba8a525018a3d441fc7 P 4dd1d49578df44b0a0325def79d07969 [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:09.232070 5368 leader_election.cc:290] T 32d5814bb7e34ba8a525018a3d441fc7 P 4dd1d49578df44b0a0325def79d07969 [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers 9a6e0058b145476b9c65606690bad44c (127.4.52.194:33533), 06e82aae52a24d6db2e51581ee7ca9ed (127.4.52.195:36693)
I20250902 21:32:09.232548 5368 raft_consensus.cc:491] T 8c955e3402124b248b2740afe7cdff4d P 4dd1d49578df44b0a0325def79d07969 [term 1 FOLLOWER]: Starting pre-election (detected failure of leader 9a6e0058b145476b9c65606690bad44c)
I20250902 21:32:09.232689 5368 raft_consensus.cc:513] T 8c955e3402124b248b2740afe7cdff4d P 4dd1d49578df44b0a0325def79d07969 [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:09.232944 5368 leader_election.cc:290] T 8c955e3402124b248b2740afe7cdff4d P 4dd1d49578df44b0a0325def79d07969 [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers 9a6e0058b145476b9c65606690bad44c (127.4.52.194:33533), 06e82aae52a24d6db2e51581ee7ca9ed (127.4.52.195:36693)
I20250902 21:32:09.244390 4748 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "32d5814bb7e34ba8a525018a3d441fc7" candidate_uuid: "4dd1d49578df44b0a0325def79d07969" candidate_term: 2 candidate_status { last_received { term: 1 index: 1692 } } ignore_live_leader: false dest_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" is_pre_election: true
I20250902 21:32:09.244405 4749 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "8c955e3402124b248b2740afe7cdff4d" candidate_uuid: "4dd1d49578df44b0a0325def79d07969" candidate_term: 2 candidate_status { last_received { term: 1 index: 1683 } } ignore_live_leader: false dest_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" is_pre_election: true
I20250902 21:32:09.252266 4619 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "8c955e3402124b248b2740afe7cdff4d" candidate_uuid: "4dd1d49578df44b0a0325def79d07969" candidate_term: 2 candidate_status { last_received { term: 1 index: 1683 } } ignore_live_leader: false dest_uuid: "9a6e0058b145476b9c65606690bad44c" is_pre_election: true
I20250902 21:32:09.252266 4617 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "32d5814bb7e34ba8a525018a3d441fc7" candidate_uuid: "4dd1d49578df44b0a0325def79d07969" candidate_term: 2 candidate_status { last_received { term: 1 index: 1692 } } ignore_live_leader: false dest_uuid: "9a6e0058b145476b9c65606690bad44c" is_pre_election: true
I20250902 21:32:09.253022 5219 leader_election.cc:304] T 8c955e3402124b248b2740afe7cdff4d P 4dd1d49578df44b0a0325def79d07969 [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 4dd1d49578df44b0a0325def79d07969; no voters: 06e82aae52a24d6db2e51581ee7ca9ed, 9a6e0058b145476b9c65606690bad44c
I20250902 21:32:09.253176 5219 leader_election.cc:304] T 32d5814bb7e34ba8a525018a3d441fc7 P 4dd1d49578df44b0a0325def79d07969 [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 4dd1d49578df44b0a0325def79d07969; no voters: 06e82aae52a24d6db2e51581ee7ca9ed, 9a6e0058b145476b9c65606690bad44c
I20250902 21:32:09.253307 5368 raft_consensus.cc:2747] T 8c955e3402124b248b2740afe7cdff4d P 4dd1d49578df44b0a0325def79d07969 [term 1 FOLLOWER]: Leader pre-election lost for term 2. Reason: could not achieve majority
I20250902 21:32:09.253386 5368 raft_consensus.cc:2747] T 32d5814bb7e34ba8a525018a3d441fc7 P 4dd1d49578df44b0a0325def79d07969 [term 1 FOLLOWER]: Leader pre-election lost for term 2. Reason: could not achieve majority
I20250902 21:32:09.380926 5345 raft_consensus.cc:491] T b3f81968196f41b3a5e3bead654d937a P 4dd1d49578df44b0a0325def79d07969 [term 2 FOLLOWER]: Starting pre-election (detected failure of leader 9a6e0058b145476b9c65606690bad44c)
I20250902 21:32:09.381011 5345 raft_consensus.cc:513] T b3f81968196f41b3a5e3bead654d937a P 4dd1d49578df44b0a0325def79d07969 [term 2 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:09.381217 5345 leader_election.cc:290] T b3f81968196f41b3a5e3bead654d937a P 4dd1d49578df44b0a0325def79d07969 [CANDIDATE]: Term 3 pre-election: Requested pre-vote from peers 9a6e0058b145476b9c65606690bad44c (127.4.52.194:33533), 06e82aae52a24d6db2e51581ee7ca9ed (127.4.52.195:36693)
I20250902 21:32:09.382421 4747 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "b3f81968196f41b3a5e3bead654d937a" candidate_uuid: "4dd1d49578df44b0a0325def79d07969" candidate_term: 3 candidate_status { last_received { term: 2 index: 3260 } } ignore_live_leader: false dest_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" is_pre_election: true
I20250902 21:32:09.382659 4618 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "b3f81968196f41b3a5e3bead654d937a" candidate_uuid: "4dd1d49578df44b0a0325def79d07969" candidate_term: 3 candidate_status { last_received { term: 2 index: 3260 } } ignore_live_leader: false dest_uuid: "9a6e0058b145476b9c65606690bad44c" is_pre_election: true
I20250902 21:32:09.383420 5219 leader_election.cc:304] T b3f81968196f41b3a5e3bead654d937a P 4dd1d49578df44b0a0325def79d07969 [CANDIDATE]: Term 3 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 4dd1d49578df44b0a0325def79d07969; no voters: 06e82aae52a24d6db2e51581ee7ca9ed, 9a6e0058b145476b9c65606690bad44c
I20250902 21:32:09.384912 5345 raft_consensus.cc:2747] T b3f81968196f41b3a5e3bead654d937a P 4dd1d49578df44b0a0325def79d07969 [term 2 FOLLOWER]: Leader pre-election lost for term 3. Reason: could not achieve majority
I20250902 21:32:09.635128 5345 raft_consensus.cc:491] T 32d5814bb7e34ba8a525018a3d441fc7 P 4dd1d49578df44b0a0325def79d07969 [term 1 FOLLOWER]: Starting pre-election (detected failure of leader 06e82aae52a24d6db2e51581ee7ca9ed)
I20250902 21:32:09.635222 5345 raft_consensus.cc:513] T 32d5814bb7e34ba8a525018a3d441fc7 P 4dd1d49578df44b0a0325def79d07969 [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } }
I20250902 21:32:09.635391 5345 leader_election.cc:290] T 32d5814bb7e34ba8a525018a3d441fc7 P 4dd1d49578df44b0a0325def79d07969 [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers 9a6e0058b145476b9c65606690bad44c (127.4.52.194:33533), 06e82aae52a24d6db2e51581ee7ca9ed (127.4.52.195:36693)
I20250902 21:32:09.635605 4619 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "32d5814bb7e34ba8a525018a3d441fc7" candidate_uuid: "4dd1d49578df44b0a0325def79d07969" candidate_term: 2 candidate_status { last_received { term: 1 index: 3258 } } ignore_live_leader: false dest_uuid: "9a6e0058b145476b9c65606690bad44c" is_pre_election: true
I20250902 21:32:09.635721 4748 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "32d5814bb7e34ba8a525018a3d441fc7" candidate_uuid: "4dd1d49578df44b0a0325def79d07969" candidate_term: 2 candidate_status { last_received { term: 1 index: 3258 } } ignore_live_leader: false dest_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" is_pre_election: true
I20250902 21:32:09.635933 5219 leader_election.cc:304] T 32d5814bb7e34ba8a525018a3d441fc7 P 4dd1d49578df44b0a0325def79d07969 [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 4dd1d49578df44b0a0325def79d07969; no voters: 06e82aae52a24d6db2e51581ee7ca9ed, 9a6e0058b145476b9c65606690bad44c
I20250902 21:32:09.636067 5345 raft_consensus.cc:2747] T 32d5814bb7e34ba8a525018a3d441fc7 P 4dd1d49578df44b0a0325def79d07969 [term 1 FOLLOWER]: Leader pre-election lost for term 2. Reason: could not achieve majority
I20250902 21:32:09.673496 5127 ts_manager.cc:284] Unset tserver state for 4dd1d49578df44b0a0325def79d07969 from MAINTENANCE_MODE
I20250902 21:32:09.900002 5059 heartbeater.cc:507] Master 127.4.52.254:35631 requested a full tablet report, sending...
I20250902 21:32:10.261722 4796 heartbeater.cc:507] Master 127.4.52.254:35631 requested a full tablet report, sending...
I20250902 21:32:10.267577 4665 heartbeater.cc:507] Master 127.4.52.254:35631 requested a full tablet report, sending...
I20250902 21:32:10.384208 5333 heartbeater.cc:507] Master 127.4.52.254:35631 requested a full tablet report, sending...
I20250902 21:32:12.766148 5126 ts_manager.cc:295] Set tserver state for 4dd1d49578df44b0a0325def79d07969 to MAINTENANCE_MODE
I20250902 21:32:12.766443 4307 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6wlXYv/build/release/bin/kudu with pid 5204
W20250902 21:32:12.789252 4553 connection.cc:537] client connection to 127.4.52.193:41549 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20250902 21:32:12.789237 4684 connection.cc:537] client connection to 127.4.52.193:41549 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20250902 21:32:12.789314 4553 proxy.cc:239] Call had error, refreshing address and retrying: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107) [suppressed 57 similar messages]
W20250902 21:32:12.789314 4684 proxy.cc:239] Call had error, refreshing address and retrying: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107) [suppressed 29 similar messages]
W20250902 21:32:12.793285 4553 consensus_peers.cc:489] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20250902 21:32:12.793332 4684 consensus_peers.cc:489] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20250902 21:32:12.793356 4553 consensus_peers.cc:489] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20250902 21:32:12.793382 4553 consensus_peers.cc:489] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20250902 21:32:12.793383 4684 consensus_peers.cc:489] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20250902 21:32:12.793402 4553 consensus_peers.cc:489] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20250902 21:32:13.221477 4553 consensus_peers.cc:489] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20250902 21:32:13.236258 4553 consensus_peers.cc:489] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20250902 21:32:13.236958 4553 consensus_peers.cc:489] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20250902 21:32:13.283779 4684 consensus_peers.cc:489] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20250902 21:32:13.285480 4684 consensus_peers.cc:489] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20250902 21:32:13.309564 4553 consensus_peers.cc:489] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20250902 21:32:13.739501 4553 consensus_peers.cc:489] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20250902 21:32:13.758605 4553 consensus_peers.cc:489] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20250902 21:32:13.761456 4553 consensus_peers.cc:489] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20250902 21:32:13.788643 4684 consensus_peers.cc:489] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20250902 21:32:13.793887 4553 consensus_peers.cc:489] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20250902 21:32:13.798267 4684 consensus_peers.cc:489] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20250902 21:32:14.249739 4553 consensus_peers.cc:489] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20250902 21:32:14.270123 4553 consensus_peers.cc:489] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20250902 21:32:14.285315 4553 consensus_peers.cc:489] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20250902 21:32:14.286566 4553 consensus_peers.cc:489] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20250902 21:32:14.312894 4684 consensus_peers.cc:489] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20250902 21:32:14.338088 4684 consensus_peers.cc:489] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20250902 21:32:14.764930 4553 consensus_peers.cc:489] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
W20250902 21:32:14.765729 4553 consensus_peers.cc:489] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20250902 21:32:14.791188 5419 consensus_queue.cc:579] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c [LEADER]: Leader has been unable to successfully communicate with peer 4dd1d49578df44b0a0325def79d07969 for more than 2 seconds (2.026s)
I20250902 21:32:14.792698 5414 consensus_queue.cc:579] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c [LEADER]: Leader has been unable to successfully communicate with peer 4dd1d49578df44b0a0325def79d07969 for more than 2 seconds (2.028s)
W20250902 21:32:14.797657 4553 consensus_peers.cc:489] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
W20250902 21:32:14.801476 4553 consensus_peers.cc:489] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20250902 21:32:14.809711 5370 consensus_queue.cc:579] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed [LEADER]: Leader has been unable to successfully communicate with peer 4dd1d49578df44b0a0325def79d07969 for more than 2 seconds (2.046s)
W20250902 21:32:14.812803 4684 consensus_peers.cc:489] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20250902 21:32:14.825544 5400 consensus_queue.cc:579] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed [LEADER]: Leader has been unable to successfully communicate with peer 4dd1d49578df44b0a0325def79d07969 for more than 2 seconds (2.061s)
W20250902 21:32:14.831873 4684 consensus_peers.cc:489] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20250902 21:32:14.884711 5416 consensus_queue.cc:579] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c [LEADER]: Leader has been unable to successfully communicate with peer 4dd1d49578df44b0a0325def79d07969 for more than 2 seconds (2.120s)
I20250902 21:32:14.891170 5414 consensus_queue.cc:579] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c [LEADER]: Leader has been unable to successfully communicate with peer 4dd1d49578df44b0a0325def79d07969 for more than 2 seconds (2.125s)
I20250902 21:32:14.980865 5127 ts_manager.cc:284] Unset tserver state for 4dd1d49578df44b0a0325def79d07969 from MAINTENANCE_MODE
W20250902 21:32:15.257727 4553 consensus_peers.cc:489] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20250902 21:32:15.291041 4553 consensus_peers.cc:489] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20250902 21:32:15.316286 4684 consensus_peers.cc:489] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20250902 21:32:15.332317 4553 consensus_peers.cc:489] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20250902 21:32:15.371435 4553 consensus_peers.cc:489] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20250902 21:32:15.407349 4684 consensus_peers.cc:489] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
I20250902 21:32:15.680685 5084 consensus_queue.cc:786] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c [LEADER]: Peer 4dd1d49578df44b0a0325def79d07969 is lagging by at least 7 ops behind the committed index [suppressed 8 similar messages]
I20250902 21:32:15.707298 5398 consensus_queue.cc:786] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed [LEADER]: Peer 4dd1d49578df44b0a0325def79d07969 is lagging by at least 14 ops behind the committed index [suppressed 6 similar messages]
I20250902 21:32:15.710843 5387 consensus_queue.cc:786] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed [LEADER]: Peer 4dd1d49578df44b0a0325def79d07969 is lagging by at least 32 ops behind the committed index [suppressed 5 similar messages]
I20250902 21:32:15.738411 5413 consensus_queue.cc:786] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c [LEADER]: Peer 4dd1d49578df44b0a0325def79d07969 is lagging by at least 43 ops behind the committed index [suppressed 5 similar messages]
I20250902 21:32:15.775331 5084 consensus_queue.cc:786] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c [LEADER]: Peer 4dd1d49578df44b0a0325def79d07969 is lagging by at least 46 ops behind the committed index [suppressed 5 similar messages]
W20250902 21:32:15.788318 4553 consensus_peers.cc:489] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
I20250902 21:32:15.798261 5084 consensus_queue.cc:786] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c [LEADER]: Peer 4dd1d49578df44b0a0325def79d07969 is lagging by at least 67 ops behind the committed index [suppressed 6 similar messages]
W20250902 21:32:15.806420 4553 consensus_peers.cc:489] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
W20250902 21:32:15.810175 4684 consensus_peers.cc:489] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
I20250902 21:32:15.827255 4796 heartbeater.cc:507] Master 127.4.52.254:35631 requested a full tablet report, sending...
W20250902 21:32:15.832962 4553 consensus_peers.cc:489] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
I20250902 21:32:15.841488 4747 logging.cc:424] LogThrottler TrackedPeer Lag: suppressed but not reported on 1 messages since previous log ~0 seconds ago
I20250902 21:32:15.841704 4747 consensus_queue.cc:237] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 6419, Committed index: 6419, Last appended: 1.6419, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 6420 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } } peers { permanent_uuid: "612f3c479a8a463b9f14f5ddab593fa5" member_type: NON_VOTER last_known_addr { host: "127.4.52.196" port: 43591 } attrs { promote: true } }
I20250902 21:32:15.842399 4619 raft_consensus.cc:1273] T 2f100030f5ec42f085bf83f379ebb850 P 9a6e0058b145476b9c65606690bad44c [term 1 FOLLOWER]: Refusing update from remote peer 06e82aae52a24d6db2e51581ee7ca9ed: Log matching property violated. Preceding OpId in replica: term: 1 index: 6419. Preceding OpId from leader: term: 1 index: 6420. (index mismatch)
I20250902 21:32:15.842686 5367 consensus_queue.cc:1035] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed [LEADER]: Connected to new peer: Peer: permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 6420, Last known committed idx: 6419, Time since last communication: 0.000s
I20250902 21:32:15.841488 4748 logging.cc:424] LogThrottler TrackedPeer Lag: suppressed but not reported on 1 messages since previous log ~0 seconds ago
I20250902 21:32:15.843191 4748 consensus_queue.cc:237] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 6417, Committed index: 6417, Last appended: 1.6420, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 6421 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } } peers { permanent_uuid: "612f3c479a8a463b9f14f5ddab593fa5" member_type: NON_VOTER last_known_addr { host: "127.4.52.196" port: 43591 } attrs { promote: true } }
I20250902 21:32:15.843461 5370 raft_consensus.cc:2953] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed [term 1 LEADER]: Committing config change with OpId 1.6420: config changed from index -1 to 6420, NON_VOTER 612f3c479a8a463b9f14f5ddab593fa5 (127.4.52.196) added. New config: { opid_index: 6420 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } } peers { permanent_uuid: "612f3c479a8a463b9f14f5ddab593fa5" member_type: NON_VOTER last_known_addr { host: "127.4.52.196" port: 43591 } attrs { promote: true } } }
W20250902 21:32:15.843956 4684 consensus_peers.cc:489] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20250902 21:32:15.843989 4684 consensus_peers.cc:489] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
I20250902 21:32:15.843636 4619 raft_consensus.cc:2953] T 2f100030f5ec42f085bf83f379ebb850 P 9a6e0058b145476b9c65606690bad44c [term 1 FOLLOWER]: Committing config change with OpId 1.6420: config changed from index -1 to 6420, NON_VOTER 612f3c479a8a463b9f14f5ddab593fa5 (127.4.52.196) added. New config: { opid_index: 6420 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } } peers { permanent_uuid: "612f3c479a8a463b9f14f5ddab593fa5" member_type: NON_VOTER last_known_addr { host: "127.4.52.196" port: 43591 } attrs { promote: true } } }
I20250902 21:32:15.844470 5111 catalog_manager.cc:5095] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet 2f100030f5ec42f085bf83f379ebb850 with cas_config_opid_index -1: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
I20250902 21:32:15.844933 5127 catalog_manager.cc:5582] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed reported cstate change: config changed from index -1 to 6420, NON_VOTER 612f3c479a8a463b9f14f5ddab593fa5 (127.4.52.196) added. New cstate: current_term: 1 leader_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" committed_config { opid_index: 6420 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "612f3c479a8a463b9f14f5ddab593fa5" member_type: NON_VOTER last_known_addr { host: "127.4.52.196" port: 43591 } attrs { promote: true } health_report { overall_health: UNKNOWN } } }
I20250902 21:32:15.845424 4619 raft_consensus.cc:1273] T 32d5814bb7e34ba8a525018a3d441fc7 P 9a6e0058b145476b9c65606690bad44c [term 1 FOLLOWER]: Refusing update from remote peer 06e82aae52a24d6db2e51581ee7ca9ed: Log matching property violated. Preceding OpId in replica: term: 1 index: 6420. Preceding OpId from leader: term: 1 index: 6422. (index mismatch)
I20250902 21:32:15.845752 5400 consensus_queue.cc:1035] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed [LEADER]: Connected to new peer: Peer: permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 6421, Last known committed idx: 6417, Time since last communication: 0.000s
I20250902 21:32:15.846961 4665 heartbeater.cc:507] Master 127.4.52.254:35631 requested a full tablet report, sending...
I20250902 21:32:15.847069 5400 raft_consensus.cc:2953] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed [term 1 LEADER]: Committing config change with OpId 1.6421: config changed from index -1 to 6421, NON_VOTER 612f3c479a8a463b9f14f5ddab593fa5 (127.4.52.196) added. New config: { opid_index: 6421 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } } peers { permanent_uuid: "612f3c479a8a463b9f14f5ddab593fa5" member_type: NON_VOTER last_known_addr { host: "127.4.52.196" port: 43591 } attrs { promote: true } } }
I20250902 21:32:15.848057 5111 catalog_manager.cc:5095] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet 32d5814bb7e34ba8a525018a3d441fc7 with cas_config_opid_index -1: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
I20250902 21:32:15.848551 5127 catalog_manager.cc:5582] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed reported cstate change: config changed from index -1 to 6421, NON_VOTER 612f3c479a8a463b9f14f5ddab593fa5 (127.4.52.196) added. New cstate: current_term: 1 leader_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" committed_config { opid_index: 6421 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "612f3c479a8a463b9f14f5ddab593fa5" member_type: NON_VOTER last_known_addr { host: "127.4.52.196" port: 43591 } attrs { promote: true } health_report { overall_health: UNKNOWN } } }
W20250902 21:32:15.849407 4683 consensus_peers.cc:489] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 612f3c479a8a463b9f14f5ddab593fa5 (127.4.52.196:43591): Couldn't send request to peer 612f3c479a8a463b9f14f5ddab593fa5. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: 2f100030f5ec42f085bf83f379ebb850. This is attempt 1: this message will repeat every 5th retry.
W20250902 21:32:15.849467 4683 consensus_peers.cc:489] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 612f3c479a8a463b9f14f5ddab593fa5 (127.4.52.196:43591): Couldn't send request to peer 612f3c479a8a463b9f14f5ddab593fa5. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: 32d5814bb7e34ba8a525018a3d441fc7. This is attempt 1: this message will repeat every 5th retry.
I20250902 21:32:15.851549 4619 raft_consensus.cc:2953] T 32d5814bb7e34ba8a525018a3d441fc7 P 9a6e0058b145476b9c65606690bad44c [term 1 FOLLOWER]: Committing config change with OpId 1.6421: config changed from index -1 to 6421, NON_VOTER 612f3c479a8a463b9f14f5ddab593fa5 (127.4.52.196) added. New config: { opid_index: 6421 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } } peers { permanent_uuid: "612f3c479a8a463b9f14f5ddab593fa5" member_type: NON_VOTER last_known_addr { host: "127.4.52.196" port: 43591 } attrs { promote: true } } }
I20250902 21:32:15.854607 4616 logging.cc:424] LogThrottler TrackedPeer Lag: suppressed but not reported on 1 messages since previous log ~0 seconds ago
I20250902 21:32:15.854736 4618 consensus_queue.cc:237] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 6425, Committed index: 6425, Last appended: 1.6425, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 6426 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } } peers { permanent_uuid: "612f3c479a8a463b9f14f5ddab593fa5" member_type: NON_VOTER last_known_addr { host: "127.4.52.196" port: 43591 } attrs { promote: true } }
I20250902 21:32:15.854784 4616 consensus_queue.cc:237] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 6422, Committed index: 6422, Last appended: 1.6425, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 6426 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } } peers { permanent_uuid: "612f3c479a8a463b9f14f5ddab593fa5" member_type: NON_VOTER last_known_addr { host: "127.4.52.196" port: 43591 } attrs { promote: true } }
I20250902 21:32:15.854745 4617 consensus_queue.cc:237] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 6425, Committed index: 6425, Last appended: 1.6425, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 6426 OBSOLETE_local: false peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "612f3c479a8a463b9f14f5ddab593fa5" member_type: NON_VOTER last_known_addr { host: "127.4.52.196" port: 43591 } attrs { promote: true } }
I20250902 21:32:15.854859 4615 logging.cc:424] LogThrottler TrackedPeer Lag: suppressed but not reported on 1 messages since previous log ~0 seconds ago
I20250902 21:32:15.855564 4615 consensus_queue.cc:237] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 6426, Committed index: 6426, Last appended: 2.6426, Last appended by leader: 130, Current term: 2, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 6427 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } } peers { permanent_uuid: "612f3c479a8a463b9f14f5ddab593fa5" member_type: NON_VOTER last_known_addr { host: "127.4.52.196" port: 43591 } attrs { promote: true } }
I20250902 21:32:15.856145 4746 raft_consensus.cc:1273] T 85463c5915304980b0a2aba153ed3da0 P 06e82aae52a24d6db2e51581ee7ca9ed [term 1 FOLLOWER]: Refusing update from remote peer 9a6e0058b145476b9c65606690bad44c: Log matching property violated. Preceding OpId in replica: term: 1 index: 6425. Preceding OpId from leader: term: 1 index: 6426. (index mismatch)
I20250902 21:32:15.856276 4746 raft_consensus.cc:1273] T 8c955e3402124b248b2740afe7cdff4d P 06e82aae52a24d6db2e51581ee7ca9ed [term 1 FOLLOWER]: Refusing update from remote peer 9a6e0058b145476b9c65606690bad44c: Log matching property violated. Preceding OpId in replica: term: 1 index: 6425. Preceding OpId from leader: term: 1 index: 6426. (index mismatch)
I20250902 21:32:15.856289 4747 raft_consensus.cc:1273] T 71a6015393b644f5abbd15f20c69a5ec P 06e82aae52a24d6db2e51581ee7ca9ed [term 1 FOLLOWER]: Refusing update from remote peer 9a6e0058b145476b9c65606690bad44c: Log matching property violated. Preceding OpId in replica: term: 1 index: 6425. Preceding OpId from leader: term: 1 index: 6426. (index mismatch)
I20250902 21:32:15.856431 4747 raft_consensus.cc:1273] T b3f81968196f41b3a5e3bead654d937a P 06e82aae52a24d6db2e51581ee7ca9ed [term 2 FOLLOWER]: Refusing update from remote peer 9a6e0058b145476b9c65606690bad44c: Log matching property violated. Preceding OpId in replica: term: 2 index: 6426. Preceding OpId from leader: term: 2 index: 6427. (index mismatch)
W20250902 21:32:15.856655 4553 consensus_peers.cc:489] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20250902 21:32:15.856698 4553 consensus_peers.cc:489] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20250902 21:32:15.856734 4553 consensus_peers.cc:489] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
I20250902 21:32:15.856839 5195 consensus_queue.cc:1035] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c [LEADER]: Connected to new peer: Peer: permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 6426, Last known committed idx: 6425, Time since last communication: 0.000s
I20250902 21:32:15.856892 5086 consensus_queue.cc:1035] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c [LEADER]: Connected to new peer: Peer: permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 6426, Last known committed idx: 6422, Time since last communication: 0.000s
I20250902 21:32:15.857103 5086 consensus_queue.cc:1035] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c [LEADER]: Connected to new peer: Peer: permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 6426, Last known committed idx: 6425, Time since last communication: 0.000s
I20250902 21:32:15.857429 5195 consensus_queue.cc:1035] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c [LEADER]: Connected to new peer: Peer: permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 6427, Last known committed idx: 6426, Time since last communication: 0.000s
I20250902 21:32:15.858021 5086 raft_consensus.cc:2953] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c [term 1 LEADER]: Committing config change with OpId 1.6426: config changed from index -1 to 6426, NON_VOTER 612f3c479a8a463b9f14f5ddab593fa5 (127.4.52.196) added. New config: { opid_index: 6426 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } } peers { permanent_uuid: "612f3c479a8a463b9f14f5ddab593fa5" member_type: NON_VOTER last_known_addr { host: "127.4.52.196" port: 43591 } attrs { promote: true } } }
I20250902 21:32:15.858192 5409 raft_consensus.cc:2953] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c [term 1 LEADER]: Committing config change with OpId 1.6426: config changed from index -1 to 6426, NON_VOTER 612f3c479a8a463b9f14f5ddab593fa5 (127.4.52.196) added. New config: { opid_index: 6426 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } } peers { permanent_uuid: "612f3c479a8a463b9f14f5ddab593fa5" member_type: NON_VOTER last_known_addr { host: "127.4.52.196" port: 43591 } attrs { promote: true } } }
I20250902 21:32:15.859203 5127 catalog_manager.cc:5582] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c reported cstate change: config changed from index -1 to 6426, NON_VOTER 612f3c479a8a463b9f14f5ddab593fa5 (127.4.52.196) added. New cstate: current_term: 1 leader_uuid: "9a6e0058b145476b9c65606690bad44c" committed_config { opid_index: 6426 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "612f3c479a8a463b9f14f5ddab593fa5" member_type: NON_VOTER last_known_addr { host: "127.4.52.196" port: 43591 } attrs { promote: true } health_report { overall_health: UNKNOWN } } }
I20250902 21:32:15.859651 5111 catalog_manager.cc:5095] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet 85463c5915304980b0a2aba153ed3da0 with cas_config_opid_index -1: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
I20250902 21:32:15.859725 5111 catalog_manager.cc:5095] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet 8c955e3402124b248b2740afe7cdff4d with cas_config_opid_index -1: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
I20250902 21:32:15.859809 5086 raft_consensus.cc:2953] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c [term 2 LEADER]: Committing config change with OpId 2.6427: config changed from index -1 to 6427, NON_VOTER 612f3c479a8a463b9f14f5ddab593fa5 (127.4.52.196) added. New config: { opid_index: 6427 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } } peers { permanent_uuid: "612f3c479a8a463b9f14f5ddab593fa5" member_type: NON_VOTER last_known_addr { host: "127.4.52.196" port: 43591 } attrs { promote: true } } }
I20250902 21:32:15.860319 5071 raft_consensus.cc:2953] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c [term 1 LEADER]: Committing config change with OpId 1.6426: config changed from index -1 to 6426, NON_VOTER 612f3c479a8a463b9f14f5ddab593fa5 (127.4.52.196) added. New config: { opid_index: 6426 OBSOLETE_local: false peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "612f3c479a8a463b9f14f5ddab593fa5" member_type: NON_VOTER last_known_addr { host: "127.4.52.196" port: 43591 } attrs { promote: true } } }
I20250902 21:32:15.860546 5111 catalog_manager.cc:5095] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet b3f81968196f41b3a5e3bead654d937a with cas_config_opid_index -1: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
I20250902 21:32:15.860812 5127 catalog_manager.cc:5582] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c reported cstate change: config changed from index -1 to 6426, NON_VOTER 612f3c479a8a463b9f14f5ddab593fa5 (127.4.52.196) added. New cstate: current_term: 1 leader_uuid: "9a6e0058b145476b9c65606690bad44c" committed_config { opid_index: 6426 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "612f3c479a8a463b9f14f5ddab593fa5" member_type: NON_VOTER last_known_addr { host: "127.4.52.196" port: 43591 } attrs { promote: true } health_report { overall_health: UNKNOWN } } }
I20250902 21:32:15.860822 4746 raft_consensus.cc:2953] T 85463c5915304980b0a2aba153ed3da0 P 06e82aae52a24d6db2e51581ee7ca9ed [term 1 FOLLOWER]: Committing config change with OpId 1.6426: config changed from index -1 to 6426, NON_VOTER 612f3c479a8a463b9f14f5ddab593fa5 (127.4.52.196) added. New config: { opid_index: 6426 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } } peers { permanent_uuid: "612f3c479a8a463b9f14f5ddab593fa5" member_type: NON_VOTER last_known_addr { host: "127.4.52.196" port: 43591 } attrs { promote: true } } }
I20250902 21:32:15.860921 5127 catalog_manager.cc:5582] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c reported cstate change: config changed from index -1 to 6427, NON_VOTER 612f3c479a8a463b9f14f5ddab593fa5 (127.4.52.196) added. New cstate: current_term: 2 leader_uuid: "9a6e0058b145476b9c65606690bad44c" committed_config { opid_index: 6427 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "612f3c479a8a463b9f14f5ddab593fa5" member_type: NON_VOTER last_known_addr { host: "127.4.52.196" port: 43591 } attrs { promote: true } health_report { overall_health: UNKNOWN } } }
W20250902 21:32:15.861253 4552 consensus_peers.cc:489] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c -> Peer 612f3c479a8a463b9f14f5ddab593fa5 (127.4.52.196:43591): Couldn't send request to peer 612f3c479a8a463b9f14f5ddab593fa5. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: 85463c5915304980b0a2aba153ed3da0. This is attempt 1: this message will repeat every 5th retry.
I20250902 21:32:15.861268 5111 catalog_manager.cc:5095] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet 71a6015393b644f5abbd15f20c69a5ec with cas_config_opid_index -1: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
W20250902 21:32:15.861313 4552 consensus_peers.cc:489] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c -> Peer 612f3c479a8a463b9f14f5ddab593fa5 (127.4.52.196:43591): Couldn't send request to peer 612f3c479a8a463b9f14f5ddab593fa5. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: 8c955e3402124b248b2740afe7cdff4d. This is attempt 1: this message will repeat every 5th retry.
W20250902 21:32:15.861346 4552 consensus_peers.cc:489] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c -> Peer 612f3c479a8a463b9f14f5ddab593fa5 (127.4.52.196:43591): Couldn't send request to peer 612f3c479a8a463b9f14f5ddab593fa5. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: 71a6015393b644f5abbd15f20c69a5ec. This is attempt 1: this message will repeat every 5th retry.
W20250902 21:32:15.861382 4552 consensus_peers.cc:489] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c -> Peer 612f3c479a8a463b9f14f5ddab593fa5 (127.4.52.196:43591): Couldn't send request to peer 612f3c479a8a463b9f14f5ddab593fa5. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: b3f81968196f41b3a5e3bead654d937a. This is attempt 1: this message will repeat every 5th retry.
W20250902 21:32:15.861855 4553 consensus_peers.cc:489] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c -> Peer 4dd1d49578df44b0a0325def79d07969 (127.4.52.193:41549): Couldn't send request to peer 4dd1d49578df44b0a0325def79d07969. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.193:41549: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
I20250902 21:32:15.862470 4746 raft_consensus.cc:2953] T 8c955e3402124b248b2740afe7cdff4d P 06e82aae52a24d6db2e51581ee7ca9ed [term 1 FOLLOWER]: Committing config change with OpId 1.6426: config changed from index -1 to 6426, NON_VOTER 612f3c479a8a463b9f14f5ddab593fa5 (127.4.52.196) added. New config: { opid_index: 6426 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } } peers { permanent_uuid: "612f3c479a8a463b9f14f5ddab593fa5" member_type: NON_VOTER last_known_addr { host: "127.4.52.196" port: 43591 } attrs { promote: true } } }
I20250902 21:32:15.862927 5127 catalog_manager.cc:5582] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c reported cstate change: config changed from index -1 to 6426, NON_VOTER 612f3c479a8a463b9f14f5ddab593fa5 (127.4.52.196) added. New cstate: current_term: 1 leader_uuid: "9a6e0058b145476b9c65606690bad44c" committed_config { opid_index: 6426 OBSOLETE_local: false peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "612f3c479a8a463b9f14f5ddab593fa5" member_type: NON_VOTER last_known_addr { host: "127.4.52.196" port: 43591 } attrs { promote: true } health_report { overall_health: UNKNOWN } } }
I20250902 21:32:15.863729 4746 raft_consensus.cc:2953] T b3f81968196f41b3a5e3bead654d937a P 06e82aae52a24d6db2e51581ee7ca9ed [term 2 FOLLOWER]: Committing config change with OpId 2.6427: config changed from index -1 to 6427, NON_VOTER 612f3c479a8a463b9f14f5ddab593fa5 (127.4.52.196) added. New config: { opid_index: 6427 OBSOLETE_local: false peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } } peers { permanent_uuid: "612f3c479a8a463b9f14f5ddab593fa5" member_type: NON_VOTER last_known_addr { host: "127.4.52.196" port: 43591 } attrs { promote: true } } }
I20250902 21:32:15.864387 4746 raft_consensus.cc:2953] T 71a6015393b644f5abbd15f20c69a5ec P 06e82aae52a24d6db2e51581ee7ca9ed [term 1 FOLLOWER]: Committing config change with OpId 1.6426: config changed from index -1 to 6426, NON_VOTER 612f3c479a8a463b9f14f5ddab593fa5 (127.4.52.196) added. New config: { opid_index: 6426 OBSOLETE_local: false peers { permanent_uuid: "06e82aae52a24d6db2e51581ee7ca9ed" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 36693 } } peers { permanent_uuid: "9a6e0058b145476b9c65606690bad44c" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33533 } } peers { permanent_uuid: "4dd1d49578df44b0a0325def79d07969" member_type: VOTER last_known_addr { host: "127.4.52.193" port: 41549 } } peers { permanent_uuid: "612f3c479a8a463b9f14f5ddab593fa5" member_type: NON_VOTER last_known_addr { host: "127.4.52.196" port: 43591 } attrs { promote: true } } }
I20250902 21:32:15.903968 5059 heartbeater.cc:507] Master 127.4.52.254:35631 requested a full tablet report, sending...
I20250902 21:32:15.935700 5442 ts_tablet_manager.cc:927] T 32d5814bb7e34ba8a525018a3d441fc7 P 612f3c479a8a463b9f14f5ddab593fa5: Initiating tablet copy from peer 06e82aae52a24d6db2e51581ee7ca9ed (127.4.52.195:36693)
I20250902 21:32:15.936205 5442 tablet_copy_client.cc:323] T 32d5814bb7e34ba8a525018a3d441fc7 P 612f3c479a8a463b9f14f5ddab593fa5: tablet copy: Beginning tablet copy session from remote peer at address 127.4.52.195:36693
I20250902 21:32:15.942989 5445 ts_tablet_manager.cc:927] T 71a6015393b644f5abbd15f20c69a5ec P 612f3c479a8a463b9f14f5ddab593fa5: Initiating tablet copy from peer 9a6e0058b145476b9c65606690bad44c (127.4.52.194:33533)
I20250902 21:32:15.943310 5446 ts_tablet_manager.cc:927] T b3f81968196f41b3a5e3bead654d937a P 612f3c479a8a463b9f14f5ddab593fa5: Initiating tablet copy from peer 9a6e0058b145476b9c65606690bad44c (127.4.52.194:33533)
I20250902 21:32:15.943421 5445 tablet_copy_client.cc:323] T 71a6015393b644f5abbd15f20c69a5ec P 612f3c479a8a463b9f14f5ddab593fa5: tablet copy: Beginning tablet copy session from remote peer at address 127.4.52.194:33533
I20250902 21:32:15.944554 4770 tablet_copy_service.cc:140] P 06e82aae52a24d6db2e51581ee7ca9ed: Received BeginTabletCopySession request for tablet 32d5814bb7e34ba8a525018a3d441fc7 from peer 612f3c479a8a463b9f14f5ddab593fa5 ({username='slave'} at 127.4.52.196:60155)
I20250902 21:32:15.944625 4770 tablet_copy_service.cc:161] P 06e82aae52a24d6db2e51581ee7ca9ed: Beginning new tablet copy session on tablet 32d5814bb7e34ba8a525018a3d441fc7 from peer 612f3c479a8a463b9f14f5ddab593fa5 at {username='slave'} at 127.4.52.196:60155: session id = 612f3c479a8a463b9f14f5ddab593fa5-32d5814bb7e34ba8a525018a3d441fc7
I20250902 21:32:15.943549 5446 tablet_copy_client.cc:323] T b3f81968196f41b3a5e3bead654d937a P 612f3c479a8a463b9f14f5ddab593fa5: tablet copy: Beginning tablet copy session from remote peer at address 127.4.52.194:33533
I20250902 21:32:15.945259 4770 tablet_copy_source_session.cc:215] T 32d5814bb7e34ba8a525018a3d441fc7 P 06e82aae52a24d6db2e51581ee7ca9ed: Tablet Copy: opened 0 blocks and 1 log segments
I20250902 21:32:15.946575 5442 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 32d5814bb7e34ba8a525018a3d441fc7. 1 dirs total, 0 dirs full, 0 dirs failed
I20250902 21:32:15.948175 4639 tablet_copy_service.cc:140] P 9a6e0058b145476b9c65606690bad44c: Received BeginTabletCopySession request for tablet 71a6015393b644f5abbd15f20c69a5ec from peer 612f3c479a8a463b9f14f5ddab593fa5 ({username='slave'} at 127.4.52.196:56629)
I20250902 21:32:15.948242 4639 tablet_copy_service.cc:161] P 9a6e0058b145476b9c65606690bad44c: Beginning new tablet copy session on tablet 71a6015393b644f5abbd15f20c69a5ec from peer 612f3c479a8a463b9f14f5ddab593fa5 at {username='slave'} at 127.4.52.196:56629: session id = 612f3c479a8a463b9f14f5ddab593fa5-71a6015393b644f5abbd15f20c69a5ec
I20250902 21:32:15.948884 4639 tablet_copy_source_session.cc:215] T 71a6015393b644f5abbd15f20c69a5ec P 9a6e0058b145476b9c65606690bad44c: Tablet Copy: opened 0 blocks and 1 log segments
I20250902 21:32:15.949095 4638 tablet_copy_service.cc:140] P 9a6e0058b145476b9c65606690bad44c: Received BeginTabletCopySession request for tablet b3f81968196f41b3a5e3bead654d937a from peer 612f3c479a8a463b9f14f5ddab593fa5 ({username='slave'} at 127.4.52.196:56629)
I20250902 21:32:15.949134 4638 tablet_copy_service.cc:161] P 9a6e0058b145476b9c65606690bad44c: Beginning new tablet copy session on tablet b3f81968196f41b3a5e3bead654d937a from peer 612f3c479a8a463b9f14f5ddab593fa5 at {username='slave'} at 127.4.52.196:56629: session id = 612f3c479a8a463b9f14f5ddab593fa5-b3f81968196f41b3a5e3bead654d937a
I20250902 21:32:15.949633 5445 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 71a6015393b644f5abbd15f20c69a5ec. 1 dirs total, 0 dirs full, 0 dirs failed
I20250902 21:32:15.951727 5445 tablet_copy_client.cc:806] T 71a6015393b644f5abbd15f20c69a5ec P 612f3c479a8a463b9f14f5ddab593fa5: tablet copy: Starting download of 0 data blocks...
I20250902 21:32:15.951874 5445 tablet_copy_client.cc:670] T 71a6015393b644f5abbd15f20c69a5ec P 612f3c479a8a463b9f14f5ddab593fa5: tablet copy: Starting download of 1 WAL segments...
I20250902 21:32:15.953130 4638 tablet_copy_source_session.cc:215] T b3f81968196f41b3a5e3bead654d937a P 9a6e0058b145476b9c65606690bad44c: Tablet Copy: opened 0 blocks and 1 log segments
I20250902 21:32:15.953832 5446 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet b3f81968196f41b3a5e3bead654d937a. 1 dirs total, 0 dirs full, 0 dirs failed
I20250902 21:32:15.954941 5442 tablet_copy_client.cc:806] T 32d5814bb7e34ba8a525018a3d441fc7 P 612f3c479a8a463b9f14f5ddab593fa5: tablet copy: Starting download of 0 data blocks...
I20250902 21:32:15.955088 5442 tablet_copy_client.cc:670] T 32d5814bb7e34ba8a525018a3d441fc7 P 612f3c479a8a463b9f14f5ddab593fa5: tablet copy: Starting download of 1 WAL segments...
I20250902 21:32:15.955746 5446 tablet_copy_client.cc:806] T b3f81968196f41b3a5e3bead654d937a P 612f3c479a8a463b9f14f5ddab593fa5: tablet copy: Starting download of 0 data blocks...
I20250902 21:32:15.955852 5446 tablet_copy_client.cc:670] T b3f81968196f41b3a5e3bead654d937a P 612f3c479a8a463b9f14f5ddab593fa5: tablet copy: Starting download of 1 WAL segments...
I20250902 21:32:15.966180 5451 ts_tablet_manager.cc:927] T 85463c5915304980b0a2aba153ed3da0 P 612f3c479a8a463b9f14f5ddab593fa5: Initiating tablet copy from peer 9a6e0058b145476b9c65606690bad44c (127.4.52.194:33533)
I20250902 21:32:15.966375 5451 tablet_copy_client.cc:323] T 85463c5915304980b0a2aba153ed3da0 P 612f3c479a8a463b9f14f5ddab593fa5: tablet copy: Beginning tablet copy session from remote peer at address 127.4.52.194:33533
I20250902 21:32:15.966579 4637 tablet_copy_service.cc:140] P 9a6e0058b145476b9c65606690bad44c: Received BeginTabletCopySession request for tablet 85463c5915304980b0a2aba153ed3da0 from peer 612f3c479a8a463b9f14f5ddab593fa5 ({username='slave'} at 127.4.52.196:56629)
I20250902 21:32:15.966630 4637 tablet_copy_service.cc:161] P 9a6e0058b145476b9c65606690bad44c: Beginning new tablet copy session on tablet 85463c5915304980b0a2aba153ed3da0 from peer 612f3c479a8a463b9f14f5ddab593fa5 at {username='slave'} at 127.4.52.196:56629: session id = 612f3c479a8a463b9f14f5ddab593fa5-85463c5915304980b0a2aba153ed3da0
I20250902 21:32:15.967156 4637 tablet_copy_source_session.cc:215] T 85463c5915304980b0a2aba153ed3da0 P 9a6e0058b145476b9c65606690bad44c: Tablet Copy: opened 0 blocks and 1 log segments
I20250902 21:32:15.967547 5453 ts_tablet_manager.cc:927] T 2f100030f5ec42f085bf83f379ebb850 P 612f3c479a8a463b9f14f5ddab593fa5: Initiating tablet copy from peer 06e82aae52a24d6db2e51581ee7ca9ed (127.4.52.195:36693)
I20250902 21:32:15.967691 5453 tablet_copy_client.cc:323] T 2f100030f5ec42f085bf83f379ebb850 P 612f3c479a8a463b9f14f5ddab593fa5: tablet copy: Beginning tablet copy session from remote peer at address 127.4.52.195:36693
I20250902 21:32:15.968010 4769 tablet_copy_service.cc:140] P 06e82aae52a24d6db2e51581ee7ca9ed: Received BeginTabletCopySession request for tablet 2f100030f5ec42f085bf83f379ebb850 from peer 612f3c479a8a463b9f14f5ddab593fa5 ({username='slave'} at 127.4.52.196:60155)
I20250902 21:32:15.968067 4769 tablet_copy_service.cc:161] P 06e82aae52a24d6db2e51581ee7ca9ed: Beginning new tablet copy session on tablet 2f100030f5ec42f085bf83f379ebb850 from peer 612f3c479a8a463b9f14f5ddab593fa5 at {username='slave'} at 127.4.52.196:60155: session id = 612f3c479a8a463b9f14f5ddab593fa5-2f100030f5ec42f085bf83f379ebb850
I20250902 21:32:15.968616 4769 tablet_copy_source_session.cc:215] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed: Tablet Copy: opened 0 blocks and 1 log segments
I20250902 21:32:15.969650 5453 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 2f100030f5ec42f085bf83f379ebb850. 1 dirs total, 0 dirs full, 0 dirs failed
I20250902 21:32:15.970618 5453 tablet_copy_client.cc:806] T 2f100030f5ec42f085bf83f379ebb850 P 612f3c479a8a463b9f14f5ddab593fa5: tablet copy: Starting download of 0 data blocks...
I20250902 21:32:15.970746 5453 tablet_copy_client.cc:670] T 2f100030f5ec42f085bf83f379ebb850 P 612f3c479a8a463b9f14f5ddab593fa5: tablet copy: Starting download of 1 WAL segments...
I20250902 21:32:15.982059 5445 tablet_copy_client.cc:538] T 71a6015393b644f5abbd15f20c69a5ec P 612f3c479a8a463b9f14f5ddab593fa5: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20250902 21:32:15.983276 5451 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 85463c5915304980b0a2aba153ed3da0. 1 dirs total, 0 dirs full, 0 dirs failed
I20250902 21:32:15.983621 5445 tablet_bootstrap.cc:492] T 71a6015393b644f5abbd15f20c69a5ec P 612f3c479a8a463b9f14f5ddab593fa5: Bootstrap starting.
I20250902 21:32:15.984472 5451 tablet_copy_client.cc:806] T 85463c5915304980b0a2aba153ed3da0 P 612f3c479a8a463b9f14f5ddab593fa5: tablet copy: Starting download of 0 data blocks...
I20250902 21:32:15.984552 5450 ts_tablet_manager.cc:927] T 8c955e3402124b248b2740afe7cdff4d P 612f3c479a8a463b9f14f5ddab593fa5: Initiating tablet copy from peer 9a6e0058b145476b9c65606690bad44c (127.4.52.194:33533)
I20250902 21:32:15.984766 5450 tablet_copy_client.cc:323] T 8c955e3402124b248b2740afe7cdff4d P 612f3c479a8a463b9f14f5ddab593fa5: tablet copy: Beginning tablet copy session from remote peer at address 127.4.52.194:33533
I20250902 21:32:15.984577 5451 tablet_copy_client.cc:670] T 85463c5915304980b0a2aba153ed3da0 P 612f3c479a8a463b9f14f5ddab593fa5: tablet copy: Starting download of 1 WAL segments...
I20250902 21:32:15.988197 5446 tablet_copy_client.cc:538] T b3f81968196f41b3a5e3bead654d937a P 612f3c479a8a463b9f14f5ddab593fa5: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20250902 21:32:15.989212 5446 tablet_bootstrap.cc:492] T b3f81968196f41b3a5e3bead654d937a P 612f3c479a8a463b9f14f5ddab593fa5: Bootstrap starting.
I20250902 21:32:15.990965 4639 tablet_copy_service.cc:140] P 9a6e0058b145476b9c65606690bad44c: Received BeginTabletCopySession request for tablet 8c955e3402124b248b2740afe7cdff4d from peer 612f3c479a8a463b9f14f5ddab593fa5 ({username='slave'} at 127.4.52.196:56629)
I20250902 21:32:15.991076 4639 tablet_copy_service.cc:161] P 9a6e0058b145476b9c65606690bad44c: Beginning new tablet copy session on tablet 8c955e3402124b248b2740afe7cdff4d from peer 612f3c479a8a463b9f14f5ddab593fa5 at {username='slave'} at 127.4.52.196:56629: session id = 612f3c479a8a463b9f14f5ddab593fa5-8c955e3402124b248b2740afe7cdff4d
I20250902 21:32:15.991750 4639 tablet_copy_source_session.cc:215] T 8c955e3402124b248b2740afe7cdff4d P 9a6e0058b145476b9c65606690bad44c: Tablet Copy: opened 0 blocks and 1 log segments
I20250902 21:32:16.001593 5450 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 8c955e3402124b248b2740afe7cdff4d. 1 dirs total, 0 dirs full, 0 dirs failed
I20250902 21:32:16.003211 5450 tablet_copy_client.cc:806] T 8c955e3402124b248b2740afe7cdff4d P 612f3c479a8a463b9f14f5ddab593fa5: tablet copy: Starting download of 0 data blocks...
I20250902 21:32:16.003345 5450 tablet_copy_client.cc:670] T 8c955e3402124b248b2740afe7cdff4d P 612f3c479a8a463b9f14f5ddab593fa5: tablet copy: Starting download of 1 WAL segments...
I20250902 21:32:16.004343 5453 tablet_copy_client.cc:538] T 2f100030f5ec42f085bf83f379ebb850 P 612f3c479a8a463b9f14f5ddab593fa5: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20250902 21:32:16.005143 5453 tablet_bootstrap.cc:492] T 2f100030f5ec42f085bf83f379ebb850 P 612f3c479a8a463b9f14f5ddab593fa5: Bootstrap starting.
I20250902 21:32:16.015390 5442 tablet_copy_client.cc:538] T 32d5814bb7e34ba8a525018a3d441fc7 P 612f3c479a8a463b9f14f5ddab593fa5: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20250902 21:32:16.016816 5442 tablet_bootstrap.cc:492] T 32d5814bb7e34ba8a525018a3d441fc7 P 612f3c479a8a463b9f14f5ddab593fa5: Bootstrap starting.
I20250902 21:32:16.028923 5451 tablet_copy_client.cc:538] T 85463c5915304980b0a2aba153ed3da0 P 612f3c479a8a463b9f14f5ddab593fa5: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20250902 21:32:16.029863 5451 tablet_bootstrap.cc:492] T 85463c5915304980b0a2aba153ed3da0 P 612f3c479a8a463b9f14f5ddab593fa5: Bootstrap starting.
I20250902 21:32:16.036852 5450 tablet_copy_client.cc:538] T 8c955e3402124b248b2740afe7cdff4d P 612f3c479a8a463b9f14f5ddab593fa5: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20250902 21:32:16.038084 5450 tablet_bootstrap.cc:492] T 8c955e3402124b248b2740afe7cdff4d P 612f3c479a8a463b9f14f5ddab593fa5: Bootstrap starting.
I20250902 21:32:16.171062 5453 log.cc:826] T 2f100030f5ec42f085bf83f379ebb850 P 612f3c479a8a463b9f14f5ddab593fa5: Log is configured to *not* fsync() on all Append() calls
I20250902 21:32:16.196086 4307 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6wlXYv/build/release/bin/kudu with pid 4537
W20250902 21:32:16.231041 4683 connection.cc:537] client connection to 127.4.52.194:33533 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20250902 21:32:16.231649 4683 consensus_peers.cc:489] T 2f100030f5ec42f085bf83f379ebb850 P 06e82aae52a24d6db2e51581ee7ca9ed -> Peer 9a6e0058b145476b9c65606690bad44c (127.4.52.194:33533): Couldn't send request to peer 9a6e0058b145476b9c65606690bad44c. Status: Network error: Client connection negotiation failed: client connection to 127.4.52.194:33533: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
I20250902 21:32:16.231937 4307 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6wlXYv/build/release/bin/kudu with pid 4668
I20250902 21:32:16.263295 4307 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6wlXYv/build/release/bin/kudu with pid 4864
I20250902 21:32:16.270273 4307 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6wlXYv/build/release/bin/kudu with pid 5096
2025-09-02T21:32:16Z chronyd exiting
[       OK ] MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate (15387 ms)
[----------] 1 test from MaintenanceModeRF3ITest (15387 ms total)
[----------] 1 test from RollingRestartArgs/RollingRestartITest
[ RUN      ] RollingRestartArgs/RollingRestartITest.TestWorkloads/4
2025-09-02T21:32:16Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2025-09-02T21:32:16Z Disabled control of system clock
I20250902 21:32:16.317256 4307 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6wlXYv/build/release/bin/kudu
/tmp/dist-test-task6wlXYv/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.4.52.254:39913
--webserver_interface=127.4.52.254
--webserver_port=0
--builtin_ntp_servers=127.4.52.212:42387
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.4.52.254:39913
--location_mapping_cmd=/tmp/dist-test-task6wlXYv/build/release/bin/testdata/assign-location.py --state_store=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/location-assignment.state --map /L0:4
--master_client_location_assignment_enabled=false with env {}
W20250902 21:32:16.388937 5468 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250902 21:32:16.389094 5468 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250902 21:32:16.389111 5468 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250902 21:32:16.390347 5468 flags.cc:425] Enabled experimental flag: --ipki_ca_key_size=768
W20250902 21:32:16.390383 5468 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250902 21:32:16.390395 5468 flags.cc:425] Enabled experimental flag: --tsk_num_rsa_bits=512
W20250902 21:32:16.390430 5468 flags.cc:425] Enabled experimental flag: --rpc_reuseport=true
I20250902 21:32:16.392161 5468 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.4.52.212:42387
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/master-0/wal
--location_mapping_cmd=/tmp/dist-test-task6wlXYv/build/release/bin/testdata/assign-location.py --state_store=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/location-assignment.state --map /L0:4
--ipki_ca_key_size=768
--master_addresses=127.4.52.254:39913
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.4.52.254:39913
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.4.52.254
--webserver_port=0
--never_fsync=true
--heap_profile_path=/tmp/kudu.5468
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true
Master server version:
kudu 1.19.0-SNAPSHOT
revision 6ab3f249351a4ee5d73eb8bda3585c4dcd85d070
build type RELEASE
built by None at 02 Sep 2025 21:15:19 UTC on 5fd53c4cbb9d
build id 7867
I20250902 21:32:16.392377 5468 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250902 21:32:16.392716 5468 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250902 21:32:16.395460 5474 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:16.395467 5473 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:16.395711 5468 server_base.cc:1047] running on GCE node
W20250902 21:32:16.395592 5476 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:16.395953 5468 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250902 21:32:16.396174 5468 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250902 21:32:16.397306 5468 hybrid_clock.cc:648] HybridClock initialized: now 1756848736397287 us; error 34 us; skew 500 ppm
I20250902 21:32:16.398340 5468 webserver.cc:480] Webserver started at http://127.4.52.254:39049/ using document root <none> and password file <none>
I20250902 21:32:16.398514 5468 fs_manager.cc:362] Metadata directory not provided
I20250902 21:32:16.398556 5468 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250902 21:32:16.398660 5468 server_base.cc:895] This appears to be a new deployment of Kudu; creating new FS layout
I20250902 21:32:16.399490 5468 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/master-0/data/instance:
uuid: "3d6d91ed1b3f4f41904ae5aa5381fb3d"
format_stamp: "Formatted at 2025-09-02 21:32:16 on dist-test-slave-jkp9"
I20250902 21:32:16.399780 5468 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/master-0/wal/instance:
uuid: "3d6d91ed1b3f4f41904ae5aa5381fb3d"
format_stamp: "Formatted at 2025-09-02 21:32:16 on dist-test-slave-jkp9"
I20250902 21:32:16.400837 5468 fs_manager.cc:696] Time spent creating directory manager: real 0.001s user 0.000s sys 0.002s
I20250902 21:32:16.401525 5482 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:16.401693 5468 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.000s sys 0.001s
I20250902 21:32:16.401767 5468 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/master-0/data,/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/master-0/wal
uuid: "3d6d91ed1b3f4f41904ae5aa5381fb3d"
format_stamp: "Formatted at 2025-09-02 21:32:16 on dist-test-slave-jkp9"
I20250902 21:32:16.401834 5468 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250902 21:32:16.409021 5468 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250902 21:32:16.409251 5468 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250902 21:32:16.409354 5468 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250902 21:32:16.412905 5468 rpc_server.cc:307] RPC server started. Bound to: 127.4.52.254:39913
I20250902 21:32:16.412940 5534 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.4.52.254:39913 every 8 connection(s)
I20250902 21:32:16.413235 5468 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/master-0/data/info.pb
I20250902 21:32:16.413776 5535 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20250902 21:32:16.415932 5535 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 3d6d91ed1b3f4f41904ae5aa5381fb3d: Bootstrap starting.
I20250902 21:32:16.416496 5535 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P 3d6d91ed1b3f4f41904ae5aa5381fb3d: Neither blocks nor log segments found. Creating new log.
I20250902 21:32:16.416723 5535 log.cc:826] T 00000000000000000000000000000000 P 3d6d91ed1b3f4f41904ae5aa5381fb3d: Log is configured to *not* fsync() on all Append() calls
I20250902 21:32:16.417256 5535 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 3d6d91ed1b3f4f41904ae5aa5381fb3d: No bootstrap required, opened a new log
I20250902 21:32:16.418320 5535 raft_consensus.cc:357] T 00000000000000000000000000000000 P 3d6d91ed1b3f4f41904ae5aa5381fb3d [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "3d6d91ed1b3f4f41904ae5aa5381fb3d" member_type: VOTER last_known_addr { host: "127.4.52.254" port: 39913 } }
I20250902 21:32:16.418427 5535 raft_consensus.cc:383] T 00000000000000000000000000000000 P 3d6d91ed1b3f4f41904ae5aa5381fb3d [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250902 21:32:16.418447 5535 raft_consensus.cc:738] T 00000000000000000000000000000000 P 3d6d91ed1b3f4f41904ae5aa5381fb3d [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 3d6d91ed1b3f4f41904ae5aa5381fb3d, State: Initialized, Role: FOLLOWER
I20250902 21:32:16.418546 5535 consensus_queue.cc:260] T 00000000000000000000000000000000 P 3d6d91ed1b3f4f41904ae5aa5381fb3d [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "3d6d91ed1b3f4f41904ae5aa5381fb3d" member_type: VOTER last_known_addr { host: "127.4.52.254" port: 39913 } }
I20250902 21:32:16.418617 5535 raft_consensus.cc:397] T 00000000000000000000000000000000 P 3d6d91ed1b3f4f41904ae5aa5381fb3d [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250902 21:32:16.418653 5535 raft_consensus.cc:491] T 00000000000000000000000000000000 P 3d6d91ed1b3f4f41904ae5aa5381fb3d [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250902 21:32:16.418684 5535 raft_consensus.cc:3058] T 00000000000000000000000000000000 P 3d6d91ed1b3f4f41904ae5aa5381fb3d [term 0 FOLLOWER]: Advancing to term 1
I20250902 21:32:16.419155 5535 raft_consensus.cc:513] T 00000000000000000000000000000000 P 3d6d91ed1b3f4f41904ae5aa5381fb3d [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "3d6d91ed1b3f4f41904ae5aa5381fb3d" member_type: VOTER last_known_addr { host: "127.4.52.254" port: 39913 } }
I20250902 21:32:16.419234 5535 leader_election.cc:304] T 00000000000000000000000000000000 P 3d6d91ed1b3f4f41904ae5aa5381fb3d [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 3d6d91ed1b3f4f41904ae5aa5381fb3d; no voters:
I20250902 21:32:16.419339 5535 leader_election.cc:290] T 00000000000000000000000000000000 P 3d6d91ed1b3f4f41904ae5aa5381fb3d [CANDIDATE]: Term 1 election: Requested vote from peers
I20250902 21:32:16.419413 5538 raft_consensus.cc:2802] T 00000000000000000000000000000000 P 3d6d91ed1b3f4f41904ae5aa5381fb3d [term 1 FOLLOWER]: Leader election won for term 1
I20250902 21:32:16.419523 5538 raft_consensus.cc:695] T 00000000000000000000000000000000 P 3d6d91ed1b3f4f41904ae5aa5381fb3d [term 1 LEADER]: Becoming Leader. State: Replica: 3d6d91ed1b3f4f41904ae5aa5381fb3d, State: Running, Role: LEADER
I20250902 21:32:16.419581 5535 sys_catalog.cc:564] T 00000000000000000000000000000000 P 3d6d91ed1b3f4f41904ae5aa5381fb3d [sys.catalog]: configured and running, proceeding with master startup.
I20250902 21:32:16.419677 5538 consensus_queue.cc:237] T 00000000000000000000000000000000 P 3d6d91ed1b3f4f41904ae5aa5381fb3d [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "3d6d91ed1b3f4f41904ae5aa5381fb3d" member_type: VOTER last_known_addr { host: "127.4.52.254" port: 39913 } }
I20250902 21:32:16.420064 5539 sys_catalog.cc:455] T 00000000000000000000000000000000 P 3d6d91ed1b3f4f41904ae5aa5381fb3d [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "3d6d91ed1b3f4f41904ae5aa5381fb3d" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "3d6d91ed1b3f4f41904ae5aa5381fb3d" member_type: VOTER last_known_addr { host: "127.4.52.254" port: 39913 } } }
I20250902 21:32:16.420107 5540 sys_catalog.cc:455] T 00000000000000000000000000000000 P 3d6d91ed1b3f4f41904ae5aa5381fb3d [sys.catalog]: SysCatalogTable state changed. Reason: New leader 3d6d91ed1b3f4f41904ae5aa5381fb3d. Latest consensus state: current_term: 1 leader_uuid: "3d6d91ed1b3f4f41904ae5aa5381fb3d" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "3d6d91ed1b3f4f41904ae5aa5381fb3d" member_type: VOTER last_known_addr { host: "127.4.52.254" port: 39913 } } }
I20250902 21:32:16.420143 5539 sys_catalog.cc:458] T 00000000000000000000000000000000 P 3d6d91ed1b3f4f41904ae5aa5381fb3d [sys.catalog]: This master's current role is: LEADER
I20250902 21:32:16.420293 5540 sys_catalog.cc:458] T 00000000000000000000000000000000 P 3d6d91ed1b3f4f41904ae5aa5381fb3d [sys.catalog]: This master's current role is: LEADER
I20250902 21:32:16.420512 5544 catalog_manager.cc:1477] Loading table and tablet metadata into memory...
I20250902 21:32:16.420790 4307 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6wlXYv/build/release/bin/kudu as pid 5468
I20250902 21:32:16.420886 4307 external_mini_cluster.cc:1442] Reading /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/master-0/wal/instance
I20250902 21:32:16.421718 5544 catalog_manager.cc:1486] Initializing Kudu cluster ID...
I20250902 21:32:16.422983 5544 catalog_manager.cc:1349] Generated new cluster ID: 81e7a186c76343f4b5f86bab5de13b3e
I20250902 21:32:16.423031 5544 catalog_manager.cc:1497] Initializing Kudu internal certificate authority...
I20250902 21:32:16.435735 5544 catalog_manager.cc:1372] Generated new certificate authority record
I20250902 21:32:16.436201 5544 catalog_manager.cc:1506] Loading token signing keys...
I20250902 21:32:16.440954 5544 catalog_manager.cc:5955] T 00000000000000000000000000000000 P 3d6d91ed1b3f4f41904ae5aa5381fb3d: Generated new TSK 0
I20250902 21:32:16.441203 5544 catalog_manager.cc:1516] Initializing in-progress tserver states...
I20250902 21:32:16.447991 4307 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6wlXYv/build/release/bin/kudu
/tmp/dist-test-task6wlXYv/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.4.52.193:0
--local_ip_for_outbound_sockets=127.4.52.193
--webserver_interface=127.4.52.193
--webserver_port=0
--tserver_master_addrs=127.4.52.254:39913
--builtin_ntp_servers=127.4.52.212:42387
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20250902 21:32:16.530153 5559 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250902 21:32:16.530309 5559 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250902 21:32:16.530329 5559 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250902 21:32:16.532001 5559 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250902 21:32:16.532068 5559 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.4.52.193
I20250902 21:32:16.533607 5559 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.4.52.212:42387
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.4.52.193:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.4.52.193
--webserver_port=0
--tserver_master_addrs=127.4.52.254:39913
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.5559
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.4.52.193
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 6ab3f249351a4ee5d73eb8bda3585c4dcd85d070
build type RELEASE
built by None at 02 Sep 2025 21:15:19 UTC on 5fd53c4cbb9d
build id 7867
I20250902 21:32:16.533800 5559 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250902 21:32:16.534027 5559 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250902 21:32:16.536487 5565 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:16.536491 5567 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:16.536597 5564 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:16.536788 5559 server_base.cc:1047] running on GCE node
I20250902 21:32:16.536943 5559 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250902 21:32:16.537153 5559 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250902 21:32:16.538341 5559 hybrid_clock.cc:648] HybridClock initialized: now 1756848736538313 us; error 43 us; skew 500 ppm
I20250902 21:32:16.539639 5559 webserver.cc:480] Webserver started at http://127.4.52.193:43863/ using document root <none> and password file <none>
I20250902 21:32:16.539849 5559 fs_manager.cc:362] Metadata directory not provided
I20250902 21:32:16.539899 5559 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250902 21:32:16.540035 5559 server_base.cc:895] This appears to be a new deployment of Kudu; creating new FS layout
I20250902 21:32:16.541131 5559 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/data/instance:
uuid: "6311ff5fc63a49108dc3000117399229"
format_stamp: "Formatted at 2025-09-02 21:32:16 on dist-test-slave-jkp9"
I20250902 21:32:16.541492 5559 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/wal/instance:
uuid: "6311ff5fc63a49108dc3000117399229"
format_stamp: "Formatted at 2025-09-02 21:32:16 on dist-test-slave-jkp9"
I20250902 21:32:16.543037 5559 fs_manager.cc:696] Time spent creating directory manager: real 0.001s user 0.000s sys 0.003s
I20250902 21:32:16.543843 5573 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:16.544015 5559 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.001s sys 0.000s
I20250902 21:32:16.544090 5559 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/data,/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/wal
uuid: "6311ff5fc63a49108dc3000117399229"
format_stamp: "Formatted at 2025-09-02 21:32:16 on dist-test-slave-jkp9"
I20250902 21:32:16.544147 5559 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250902 21:32:16.567368 5559 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250902 21:32:16.567618 5559 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250902 21:32:16.567724 5559 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250902 21:32:16.567926 5559 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250902 21:32:16.568219 5559 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250902 21:32:16.568248 5559 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:16.568279 5559 ts_tablet_manager.cc:610] Registered 0 tablets
I20250902 21:32:16.568305 5559 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:16.573314 5559 rpc_server.cc:307] RPC server started. Bound to: 127.4.52.193:34353
I20250902 21:32:16.573416 5686 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.4.52.193:34353 every 8 connection(s)
I20250902 21:32:16.573657 5559 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/data/info.pb
I20250902 21:32:16.579048 5687 heartbeater.cc:344] Connected to a master server at 127.4.52.254:39913
I20250902 21:32:16.579140 5687 heartbeater.cc:461] Registering TS with master...
I20250902 21:32:16.579334 5687 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
I20250902 21:32:16.583036 4307 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6wlXYv/build/release/bin/kudu as pid 5559
I20250902 21:32:16.583127 4307 external_mini_cluster.cc:1442] Reading /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/wal/instance
I20250902 21:32:16.584724 4307 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6wlXYv/build/release/bin/kudu
/tmp/dist-test-task6wlXYv/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.4.52.194:0
--local_ip_for_outbound_sockets=127.4.52.194
--webserver_interface=127.4.52.194
--webserver_port=0
--tserver_master_addrs=127.4.52.254:39913
--builtin_ntp_servers=127.4.52.212:42387
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20250902 21:32:16.623010 5499 ts_manager.cc:194] Registered new tserver with Master: 6311ff5fc63a49108dc3000117399229 (127.4.52.193:34353)
I20250902 21:32:16.623690 5499 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.4.52.193:37695
W20250902 21:32:16.672096 5691 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250902 21:32:16.672272 5691 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250902 21:32:16.672302 5691 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250902 21:32:16.673564 5691 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250902 21:32:16.673616 5691 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.4.52.194
I20250902 21:32:16.675007 5691 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.4.52.212:42387
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.4.52.194:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.4.52.194
--webserver_port=0
--tserver_master_addrs=127.4.52.254:39913
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.5691
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.4.52.194
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 6ab3f249351a4ee5d73eb8bda3585c4dcd85d070
build type RELEASE
built by None at 02 Sep 2025 21:15:19 UTC on 5fd53c4cbb9d
build id 7867
I20250902 21:32:16.675209 5691 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250902 21:32:16.675422 5691 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250902 21:32:16.678058 5696 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:16.678030 5697 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:16.678066 5691 server_base.cc:1047] running on GCE node
W20250902 21:32:16.678033 5699 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:16.678416 5691 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250902 21:32:16.678581 5691 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250902 21:32:16.679699 5691 hybrid_clock.cc:648] HybridClock initialized: now 1756848736679688 us; error 28 us; skew 500 ppm
I20250902 21:32:16.680743 5691 webserver.cc:480] Webserver started at http://127.4.52.194:37969/ using document root <none> and password file <none>
I20250902 21:32:16.680922 5691 fs_manager.cc:362] Metadata directory not provided
I20250902 21:32:16.680963 5691 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250902 21:32:16.681068 5691 server_base.cc:895] This appears to be a new deployment of Kudu; creating new FS layout
I20250902 21:32:16.681921 5691 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/data/instance:
uuid: "b41910bc980f4cfdbcc6eb23e1084325"
format_stamp: "Formatted at 2025-09-02 21:32:16 on dist-test-slave-jkp9"
I20250902 21:32:16.682204 5691 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/wal/instance:
uuid: "b41910bc980f4cfdbcc6eb23e1084325"
format_stamp: "Formatted at 2025-09-02 21:32:16 on dist-test-slave-jkp9"
I20250902 21:32:16.683406 5691 fs_manager.cc:696] Time spent creating directory manager: real 0.001s user 0.002s sys 0.000s
I20250902 21:32:16.684069 5705 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:16.684252 5691 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.001s sys 0.000s
I20250902 21:32:16.684314 5691 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/data,/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/wal
uuid: "b41910bc980f4cfdbcc6eb23e1084325"
format_stamp: "Formatted at 2025-09-02 21:32:16 on dist-test-slave-jkp9"
I20250902 21:32:16.684362 5691 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250902 21:32:16.695538 5691 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250902 21:32:16.695736 5691 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250902 21:32:16.695814 5691 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250902 21:32:16.695976 5691 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250902 21:32:16.696205 5691 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250902 21:32:16.696231 5691 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:16.696257 5691 ts_tablet_manager.cc:610] Registered 0 tablets
I20250902 21:32:16.696274 5691 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:16.701249 5691 rpc_server.cc:307] RPC server started. Bound to: 127.4.52.194:33503
I20250902 21:32:16.701347 5818 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.4.52.194:33503 every 8 connection(s)
I20250902 21:32:16.701594 5691 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/data/info.pb
I20250902 21:32:16.703433 4307 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6wlXYv/build/release/bin/kudu as pid 5691
I20250902 21:32:16.703531 4307 external_mini_cluster.cc:1442] Reading /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/wal/instance
I20250902 21:32:16.704524 4307 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6wlXYv/build/release/bin/kudu
/tmp/dist-test-task6wlXYv/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.4.52.195:0
--local_ip_for_outbound_sockets=127.4.52.195
--webserver_interface=127.4.52.195
--webserver_port=0
--tserver_master_addrs=127.4.52.254:39913
--builtin_ntp_servers=127.4.52.212:42387
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20250902 21:32:16.706151 5819 heartbeater.cc:344] Connected to a master server at 127.4.52.254:39913
I20250902 21:32:16.706254 5819 heartbeater.cc:461] Registering TS with master...
I20250902 21:32:16.706457 5819 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
I20250902 21:32:16.751830 5499 ts_manager.cc:194] Registered new tserver with Master: b41910bc980f4cfdbcc6eb23e1084325 (127.4.52.194:33503)
I20250902 21:32:16.752359 5499 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.4.52.194:47633
W20250902 21:32:16.791203 5822 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250902 21:32:16.791357 5822 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250902 21:32:16.791378 5822 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250902 21:32:16.792703 5822 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250902 21:32:16.792753 5822 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.4.52.195
I20250902 21:32:16.794119 5822 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.4.52.212:42387
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.4.52.195:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.4.52.195
--webserver_port=0
--tserver_master_addrs=127.4.52.254:39913
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.5822
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.4.52.195
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 6ab3f249351a4ee5d73eb8bda3585c4dcd85d070
build type RELEASE
built by None at 02 Sep 2025 21:15:19 UTC on 5fd53c4cbb9d
build id 7867
I20250902 21:32:16.794322 5822 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250902 21:32:16.794544 5822 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250902 21:32:16.797063 5828 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:16.797120 5829 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:16.797222 5831 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:16.797302 5822 server_base.cc:1047] running on GCE node
I20250902 21:32:16.797451 5822 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250902 21:32:16.797636 5822 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250902 21:32:16.798786 5822 hybrid_clock.cc:648] HybridClock initialized: now 1756848736798771 us; error 32 us; skew 500 ppm
I20250902 21:32:16.799767 5822 webserver.cc:480] Webserver started at http://127.4.52.195:32789/ using document root <none> and password file <none>
I20250902 21:32:16.799933 5822 fs_manager.cc:362] Metadata directory not provided
I20250902 21:32:16.799969 5822 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250902 21:32:16.800067 5822 server_base.cc:895] This appears to be a new deployment of Kudu; creating new FS layout
I20250902 21:32:16.800832 5822 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/data/instance:
uuid: "f79e0a34bf4d4181aed51bd155010a25"
format_stamp: "Formatted at 2025-09-02 21:32:16 on dist-test-slave-jkp9"
I20250902 21:32:16.801131 5822 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/wal/instance:
uuid: "f79e0a34bf4d4181aed51bd155010a25"
format_stamp: "Formatted at 2025-09-02 21:32:16 on dist-test-slave-jkp9"
I20250902 21:32:16.802284 5822 fs_manager.cc:696] Time spent creating directory manager: real 0.001s user 0.002s sys 0.000s
I20250902 21:32:16.803027 5837 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:16.803164 5822 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.001s sys 0.000s
I20250902 21:32:16.803238 5822 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/data,/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/wal
uuid: "f79e0a34bf4d4181aed51bd155010a25"
format_stamp: "Formatted at 2025-09-02 21:32:16 on dist-test-slave-jkp9"
I20250902 21:32:16.803280 5822 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250902 21:32:16.810010 5822 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250902 21:32:16.810168 5822 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250902 21:32:16.810241 5822 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250902 21:32:16.810375 5822 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250902 21:32:16.810664 5822 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250902 21:32:16.810691 5822 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:16.810717 5822 ts_tablet_manager.cc:610] Registered 0 tablets
I20250902 21:32:16.810735 5822 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:16.815623 5822 rpc_server.cc:307] RPC server started. Bound to: 127.4.52.195:46587
I20250902 21:32:16.815726 5950 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.4.52.195:46587 every 8 connection(s)
I20250902 21:32:16.815960 5822 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/data/info.pb
I20250902 21:32:16.819301 4307 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6wlXYv/build/release/bin/kudu as pid 5822
I20250902 21:32:16.819404 4307 external_mini_cluster.cc:1442] Reading /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/wal/instance
I20250902 21:32:16.820516 5951 heartbeater.cc:344] Connected to a master server at 127.4.52.254:39913
I20250902 21:32:16.820571 4307 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6wlXYv/build/release/bin/kudu
/tmp/dist-test-task6wlXYv/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.4.52.196:0
--local_ip_for_outbound_sockets=127.4.52.196
--webserver_interface=127.4.52.196
--webserver_port=0
--tserver_master_addrs=127.4.52.254:39913
--builtin_ntp_servers=127.4.52.212:42387
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20250902 21:32:16.820600 5951 heartbeater.cc:461] Registering TS with master...
I20250902 21:32:16.820817 5951 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
I20250902 21:32:16.855397 5499 ts_manager.cc:194] Registered new tserver with Master: f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587)
I20250902 21:32:16.855862 5499 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.4.52.195:51843
W20250902 21:32:16.895213 5954 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250902 21:32:16.895361 5954 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250902 21:32:16.895380 5954 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250902 21:32:16.896646 5954 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250902 21:32:16.896695 5954 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.4.52.196
I20250902 21:32:16.898022 5954 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.4.52.212:42387
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.4.52.196:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.4.52.196
--webserver_port=0
--tserver_master_addrs=127.4.52.254:39913
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.5954
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.4.52.196
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 6ab3f249351a4ee5d73eb8bda3585c4dcd85d070
build type RELEASE
built by None at 02 Sep 2025 21:15:19 UTC on 5fd53c4cbb9d
build id 7867
I20250902 21:32:16.898219 5954 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250902 21:32:16.898417 5954 file_cache.cc:492] Constructed file cache file cache with capacity 419430
I20250902 21:32:16.900746 5954 server_base.cc:1047] running on GCE node
W20250902 21:32:16.900867 5960 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:16.900959 5961 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:16.900750 5963 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:16.901206 5954 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250902 21:32:16.901396 5954 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250902 21:32:16.902541 5954 hybrid_clock.cc:648] HybridClock initialized: now 1756848736902521 us; error 41 us; skew 500 ppm
I20250902 21:32:16.903596 5954 webserver.cc:480] Webserver started at http://127.4.52.196:36885/ using document root <none> and password file <none>
I20250902 21:32:16.903774 5954 fs_manager.cc:362] Metadata directory not provided
I20250902 21:32:16.903829 5954 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250902 21:32:16.903934 5954 server_base.cc:895] This appears to be a new deployment of Kudu; creating new FS layout
I20250902 21:32:16.904707 5954 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/data/instance:
uuid: "3a35f7f28cb9438dbcfb3196e167fdc5"
format_stamp: "Formatted at 2025-09-02 21:32:16 on dist-test-slave-jkp9"
I20250902 21:32:16.905000 5954 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/wal/instance:
uuid: "3a35f7f28cb9438dbcfb3196e167fdc5"
format_stamp: "Formatted at 2025-09-02 21:32:16 on dist-test-slave-jkp9"
I20250902 21:32:16.906086 5954 fs_manager.cc:696] Time spent creating directory manager: real 0.001s user 0.002s sys 0.000s
I20250902 21:32:16.906713 5969 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:16.906874 5954 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.001s sys 0.000s
I20250902 21:32:16.906935 5954 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/data,/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/wal
uuid: "3a35f7f28cb9438dbcfb3196e167fdc5"
format_stamp: "Formatted at 2025-09-02 21:32:16 on dist-test-slave-jkp9"
I20250902 21:32:16.906991 5954 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250902 21:32:16.928633 5954 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250902 21:32:16.928867 5954 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250902 21:32:16.928967 5954 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250902 21:32:16.929188 5954 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250902 21:32:16.929483 5954 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250902 21:32:16.929513 5954 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:16.929540 5954 ts_tablet_manager.cc:610] Registered 0 tablets
I20250902 21:32:16.929559 5954 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:16.934679 5954 rpc_server.cc:307] RPC server started. Bound to: 127.4.52.196:35541
I20250902 21:32:16.934741 6082 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.4.52.196:35541 every 8 connection(s)
I20250902 21:32:16.935019 5954 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/data/info.pb
I20250902 21:32:16.935388 4307 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6wlXYv/build/release/bin/kudu as pid 5954
I20250902 21:32:16.935493 4307 external_mini_cluster.cc:1442] Reading /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/wal/instance
I20250902 21:32:16.939144 6083 heartbeater.cc:344] Connected to a master server at 127.4.52.254:39913
I20250902 21:32:16.939218 6083 heartbeater.cc:461] Registering TS with master...
I20250902 21:32:16.939386 6083 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
I20250902 21:32:16.971014 5499 ts_manager.cc:194] Registered new tserver with Master: 3a35f7f28cb9438dbcfb3196e167fdc5 (127.4.52.196:35541)
I20250902 21:32:16.971478 5499 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.4.52.196:55619
I20250902 21:32:16.980197 4307 external_mini_cluster.cc:949] 4 TS(s) registered with all masters
I20250902 21:32:16.985787 4307 test_util.cc:276] Using random seed: 954491577
I20250902 21:32:16.991812 5499 catalog_manager.cc:2232] Servicing CreateTable request from {username='slave'} at 127.0.0.1:38752:
name: "test-workload"
schema {
columns {
name: "key"
type: INT32
is_key: true
is_nullable: false
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
columns {
name: "int_val"
type: INT32
is_key: false
is_nullable: false
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
columns {
name: "string_val"
type: STRING
is_key: false
is_nullable: true
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
}
num_replicas: 3
split_rows_range_bounds {
}
partition_schema {
range_schema {
columns {
name: "key"
}
}
}
I20250902 21:32:16.998449 5753 tablet_service.cc:1468] Processing CreateTablet for tablet 1bd11ec67527495b831ab711f8a2f39b (DEFAULT_TABLE table=test-workload [id=48619a55c4144f7ca28dc868607a6e3c]), partition=RANGE (key) PARTITION UNBOUNDED
I20250902 21:32:16.998596 5885 tablet_service.cc:1468] Processing CreateTablet for tablet 1bd11ec67527495b831ab711f8a2f39b (DEFAULT_TABLE table=test-workload [id=48619a55c4144f7ca28dc868607a6e3c]), partition=RANGE (key) PARTITION UNBOUNDED
I20250902 21:32:16.998777 5753 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 1bd11ec67527495b831ab711f8a2f39b. 1 dirs total, 0 dirs full, 0 dirs failed
I20250902 21:32:16.998852 5885 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 1bd11ec67527495b831ab711f8a2f39b. 1 dirs total, 0 dirs full, 0 dirs failed
I20250902 21:32:16.998859 6017 tablet_service.cc:1468] Processing CreateTablet for tablet 1bd11ec67527495b831ab711f8a2f39b (DEFAULT_TABLE table=test-workload [id=48619a55c4144f7ca28dc868607a6e3c]), partition=RANGE (key) PARTITION UNBOUNDED
I20250902 21:32:16.999096 6017 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 1bd11ec67527495b831ab711f8a2f39b. 1 dirs total, 0 dirs full, 0 dirs failed
I20250902 21:32:17.001019 6106 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Bootstrap starting.
I20250902 21:32:17.001303 6107 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Bootstrap starting.
I20250902 21:32:17.001490 6108 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Bootstrap starting.
I20250902 21:32:17.001946 6107 tablet_bootstrap.cc:654] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Neither blocks nor log segments found. Creating new log.
I20250902 21:32:17.001981 6106 tablet_bootstrap.cc:654] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Neither blocks nor log segments found. Creating new log.
I20250902 21:32:17.002075 6108 tablet_bootstrap.cc:654] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Neither blocks nor log segments found. Creating new log.
I20250902 21:32:17.002295 6107 log.cc:826] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Log is configured to *not* fsync() on all Append() calls
I20250902 21:32:17.002351 6108 log.cc:826] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Log is configured to *not* fsync() on all Append() calls
I20250902 21:32:17.002419 6106 log.cc:826] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Log is configured to *not* fsync() on all Append() calls
I20250902 21:32:17.002964 6107 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: No bootstrap required, opened a new log
I20250902 21:32:17.003001 6106 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: No bootstrap required, opened a new log
I20250902 21:32:17.003021 6108 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: No bootstrap required, opened a new log
I20250902 21:32:17.003022 6107 ts_tablet_manager.cc:1397] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Time spent bootstrapping tablet: real 0.002s user 0.002s sys 0.000s
I20250902 21:32:17.003039 6106 ts_tablet_manager.cc:1397] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Time spent bootstrapping tablet: real 0.002s user 0.001s sys 0.000s
I20250902 21:32:17.003064 6108 ts_tablet_manager.cc:1397] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Time spent bootstrapping tablet: real 0.002s user 0.001s sys 0.000s
I20250902 21:32:17.004070 6106 raft_consensus.cc:357] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:17.004159 6106 raft_consensus.cc:383] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250902 21:32:17.004176 6106 raft_consensus.cc:738] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: f79e0a34bf4d4181aed51bd155010a25, State: Initialized, Role: FOLLOWER
I20250902 21:32:17.004242 6106 consensus_queue.cc:260] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:17.004361 6108 raft_consensus.cc:357] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:17.004361 6107 raft_consensus.cc:357] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:17.004475 6108 raft_consensus.cc:383] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250902 21:32:17.004482 6107 raft_consensus.cc:383] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250902 21:32:17.004477 6106 ts_tablet_manager.cc:1428] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Time spent starting tablet: real 0.001s user 0.002s sys 0.000s
I20250902 21:32:17.004498 5951 heartbeater.cc:499] Master 127.4.52.254:39913 was elected leader, sending a full tablet report...
I20250902 21:32:17.004509 6107 raft_consensus.cc:738] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: b41910bc980f4cfdbcc6eb23e1084325, State: Initialized, Role: FOLLOWER
I20250902 21:32:17.004508 6108 raft_consensus.cc:738] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 3a35f7f28cb9438dbcfb3196e167fdc5, State: Initialized, Role: FOLLOWER
I20250902 21:32:17.004597 6107 consensus_queue.cc:260] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:17.004597 6108 consensus_queue.cc:260] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:17.004797 6108 ts_tablet_manager.cc:1428] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Time spent starting tablet: real 0.002s user 0.002s sys 0.000s
I20250902 21:32:17.004804 6107 ts_tablet_manager.cc:1428] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Time spent starting tablet: real 0.002s user 0.002s sys 0.000s
I20250902 21:32:17.004912 6083 heartbeater.cc:499] Master 127.4.52.254:39913 was elected leader, sending a full tablet report...
I20250902 21:32:17.005103 5819 heartbeater.cc:499] Master 127.4.52.254:39913 was elected leader, sending a full tablet report...
W20250902 21:32:17.066457 5952 tablet.cc:2378] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Can't schedule compaction. Clean time has not been advanced past its initial value.
I20250902 21:32:17.081969 6114 raft_consensus.cc:491] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20250902 21:32:17.082080 6114 raft_consensus.cc:513] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:17.082397 6114 leader_election.cc:290] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587), b41910bc980f4cfdbcc6eb23e1084325 (127.4.52.194:33503)
I20250902 21:32:17.085631 5905 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "f79e0a34bf4d4181aed51bd155010a25" is_pre_election: true
I20250902 21:32:17.085631 5773 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "b41910bc980f4cfdbcc6eb23e1084325" is_pre_election: true
I20250902 21:32:17.085778 5905 raft_consensus.cc:2466] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 3a35f7f28cb9438dbcfb3196e167fdc5 in term 0.
I20250902 21:32:17.085778 5773 raft_consensus.cc:2466] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 3a35f7f28cb9438dbcfb3196e167fdc5 in term 0.
I20250902 21:32:17.086222 5970 leader_election.cc:304] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 3a35f7f28cb9438dbcfb3196e167fdc5, b41910bc980f4cfdbcc6eb23e1084325; no voters:
I20250902 21:32:17.086354 6114 raft_consensus.cc:2802] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20250902 21:32:17.086426 6114 raft_consensus.cc:491] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20250902 21:32:17.086464 6114 raft_consensus.cc:3058] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 0 FOLLOWER]: Advancing to term 1
I20250902 21:32:17.087133 6114 raft_consensus.cc:513] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:17.087241 6114 leader_election.cc:290] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [CANDIDATE]: Term 1 election: Requested vote from peers f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587), b41910bc980f4cfdbcc6eb23e1084325 (127.4.52.194:33503)
I20250902 21:32:17.087401 5905 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "f79e0a34bf4d4181aed51bd155010a25"
I20250902 21:32:17.087466 5773 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "b41910bc980f4cfdbcc6eb23e1084325"
I20250902 21:32:17.087486 5905 raft_consensus.cc:3058] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 0 FOLLOWER]: Advancing to term 1
I20250902 21:32:17.087530 5773 raft_consensus.cc:3058] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 0 FOLLOWER]: Advancing to term 1
I20250902 21:32:17.088217 5905 raft_consensus.cc:2466] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 3a35f7f28cb9438dbcfb3196e167fdc5 in term 1.
I20250902 21:32:17.088240 5773 raft_consensus.cc:2466] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 3a35f7f28cb9438dbcfb3196e167fdc5 in term 1.
I20250902 21:32:17.088382 5970 leader_election.cc:304] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 3a35f7f28cb9438dbcfb3196e167fdc5, b41910bc980f4cfdbcc6eb23e1084325; no voters:
I20250902 21:32:17.088461 6114 raft_consensus.cc:2802] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 1 FOLLOWER]: Leader election won for term 1
I20250902 21:32:17.088596 6114 raft_consensus.cc:695] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 1 LEADER]: Becoming Leader. State: Replica: 3a35f7f28cb9438dbcfb3196e167fdc5, State: Running, Role: LEADER
I20250902 21:32:17.088680 6114 consensus_queue.cc:237] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:17.089358 5499 catalog_manager.cc:5582] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 reported cstate change: term changed from 0 to 1, leader changed from <none> to 3a35f7f28cb9438dbcfb3196e167fdc5 (127.4.52.196). New cstate: current_term: 1 leader_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } health_report { overall_health: UNKNOWN } } }
I20250902 21:32:17.109192 4307 maintenance_mode-itest.cc:745] Restarting batch of 4 tservers: b41910bc980f4cfdbcc6eb23e1084325,6311ff5fc63a49108dc3000117399229,3a35f7f28cb9438dbcfb3196e167fdc5,f79e0a34bf4d4181aed51bd155010a25
I20250902 21:32:17.147246 5773 raft_consensus.cc:1273] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 1 FOLLOWER]: Refusing update from remote peer 3a35f7f28cb9438dbcfb3196e167fdc5: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20250902 21:32:17.147668 6114 consensus_queue.cc:1035] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [LEADER]: Connected to new peer: Peer: permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20250902 21:32:17.149694 5905 raft_consensus.cc:1273] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 1 FOLLOWER]: Refusing update from remote peer 3a35f7f28cb9438dbcfb3196e167fdc5: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20250902 21:32:17.149961 6114 consensus_queue.cc:1035] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [LEADER]: Connected to new peer: Peer: permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20250902 21:32:17.154531 6133 mvcc.cc:204] Tried to move back new op lower bound from 7196052427328307200 to 7196052427115728896. Current Snapshot: MvccSnapshot[applied={T|T < 7196052427328307200}]
I20250902 21:32:17.157307 6136 mvcc.cc:204] Tried to move back new op lower bound from 7196052427328307200 to 7196052427115728896. Current Snapshot: MvccSnapshot[applied={T|T < 7196052427328307200}]
I20250902 21:32:17.158187 6134 mvcc.cc:204] Tried to move back new op lower bound from 7196052427328307200 to 7196052427115728896. Current Snapshot: MvccSnapshot[applied={T|T < 7196052427328307200}]
I20250902 21:32:17.252782 5495 ts_manager.cc:295] Set tserver state for 6311ff5fc63a49108dc3000117399229 to MAINTENANCE_MODE
I20250902 21:32:17.257468 5495 ts_manager.cc:295] Set tserver state for 3a35f7f28cb9438dbcfb3196e167fdc5 to MAINTENANCE_MODE
I20250902 21:32:17.323181 5495 ts_manager.cc:295] Set tserver state for b41910bc980f4cfdbcc6eb23e1084325 to MAINTENANCE_MODE
I20250902 21:32:17.339018 5495 ts_manager.cc:295] Set tserver state for f79e0a34bf4d4181aed51bd155010a25 to MAINTENANCE_MODE
I20250902 21:32:17.516408 5621 tablet_service.cc:1423] Tablet server 6311ff5fc63a49108dc3000117399229 set to quiescing
I20250902 21:32:17.516487 5621 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250902 21:32:17.567884 6017 tablet_service.cc:1423] Tablet server 3a35f7f28cb9438dbcfb3196e167fdc5 set to quiescing
I20250902 21:32:17.567942 6017 tablet_service.cc:1430] Tablet server has 1 leaders and 0 scanners
I20250902 21:32:17.569154 6150 raft_consensus.cc:991] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: : Instructing follower f79e0a34bf4d4181aed51bd155010a25 to start an election
I20250902 21:32:17.569321 6116 raft_consensus.cc:1079] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 1 LEADER]: Signalling peer f79e0a34bf4d4181aed51bd155010a25 to start an election
I20250902 21:32:17.570366 5905 tablet_service.cc:1940] Received Run Leader Election RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b"
dest_uuid: "f79e0a34bf4d4181aed51bd155010a25"
from {username='slave'} at 127.4.52.196:37797
I20250902 21:32:17.570521 5905 raft_consensus.cc:491] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 1 FOLLOWER]: Starting forced leader election (received explicit request)
I20250902 21:32:17.570560 5905 raft_consensus.cc:3058] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 1 FOLLOWER]: Advancing to term 2
I20250902 21:32:17.571388 5905 raft_consensus.cc:513] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 2 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:17.571838 5905 leader_election.cc:290] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [CANDIDATE]: Term 2 election: Requested vote from peers 3a35f7f28cb9438dbcfb3196e167fdc5 (127.4.52.196:35541), b41910bc980f4cfdbcc6eb23e1084325 (127.4.52.194:33503)
I20250902 21:32:17.573041 5905 raft_consensus.cc:1238] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 2 FOLLOWER]: Rejecting Update request from peer 3a35f7f28cb9438dbcfb3196e167fdc5 for earlier term 1. Current term is 2. Ops: [1.193-1.195]
I20250902 21:32:17.573343 6116 consensus_queue.cc:1046] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [LEADER]: Peer responded invalid term: Peer: permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 }, Status: INVALID_TERM, Last received: 1.192, Next index: 193, Last known committed idx: 192, Time since last communication: 0.000s
I20250902 21:32:17.573424 6116 raft_consensus.cc:991] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: : Instructing follower b41910bc980f4cfdbcc6eb23e1084325 to start an election
I20250902 21:32:17.573451 6116 raft_consensus.cc:1079] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 1 LEADER]: Signalling peer b41910bc980f4cfdbcc6eb23e1084325 to start an election
I20250902 21:32:17.573494 6116 raft_consensus.cc:3053] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 1 LEADER]: Stepping down as leader of term 1
I20250902 21:32:17.573515 6116 raft_consensus.cc:738] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 1 LEADER]: Becoming Follower/Learner. State: Replica: 3a35f7f28cb9438dbcfb3196e167fdc5, State: Running, Role: LEADER
I20250902 21:32:17.573559 6116 consensus_queue.cc:260] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 195, Committed index: 195, Last appended: 1.195, Last appended by leader: 195, Current term: 1, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:17.573643 6116 raft_consensus.cc:3058] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 1 FOLLOWER]: Advancing to term 2
I20250902 21:32:17.574792 5773 tablet_service.cc:1940] Received Run Leader Election RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b"
dest_uuid: "b41910bc980f4cfdbcc6eb23e1084325"
from {username='slave'} at 127.4.52.196:55049
I20250902 21:32:17.574878 5773 raft_consensus.cc:491] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 1 FOLLOWER]: Starting forced leader election (received explicit request)
I20250902 21:32:17.574906 5773 raft_consensus.cc:3058] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 1 FOLLOWER]: Advancing to term 2
I20250902 21:32:17.575584 5773 raft_consensus.cc:513] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 2 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:17.575819 5773 leader_election.cc:290] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [CANDIDATE]: Term 2 election: Requested vote from peers 3a35f7f28cb9438dbcfb3196e167fdc5 (127.4.52.196:35541), f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587)
I20250902 21:32:17.576030 6037 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "f79e0a34bf4d4181aed51bd155010a25" candidate_term: 2 candidate_status { last_received { term: 1 index: 192 } } ignore_live_leader: true dest_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5"
I20250902 21:32:17.576117 6037 raft_consensus.cc:2408] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 2 FOLLOWER]: Leader election vote request: Denying vote to candidate f79e0a34bf4d4181aed51bd155010a25 for term 2 because replica has last-logged OpId of term: 1 index: 195, which is greater than that of the candidate, which has last-logged OpId of term: 1 index: 192.
I20250902 21:32:17.578317 5753 tablet_service.cc:1423] Tablet server b41910bc980f4cfdbcc6eb23e1084325 set to quiescing
I20250902 21:32:17.578375 5753 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250902 21:32:17.579250 6037 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "b41910bc980f4cfdbcc6eb23e1084325" candidate_term: 2 candidate_status { last_received { term: 1 index: 195 } } ignore_live_leader: true dest_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5"
I20250902 21:32:17.580024 6037 raft_consensus.cc:2466] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 2 FOLLOWER]: Leader election vote request: Granting yes vote for candidate b41910bc980f4cfdbcc6eb23e1084325 in term 2.
I20250902 21:32:17.580199 5708 leader_election.cc:304] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [CANDIDATE]: Term 2 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 3a35f7f28cb9438dbcfb3196e167fdc5, b41910bc980f4cfdbcc6eb23e1084325; no voters:
I20250902 21:32:17.580299 6113 raft_consensus.cc:2802] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 2 FOLLOWER]: Leader election won for term 2
I20250902 21:32:17.580392 6113 raft_consensus.cc:695] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 2 LEADER]: Becoming Leader. State: Replica: b41910bc980f4cfdbcc6eb23e1084325, State: Running, Role: LEADER
I20250902 21:32:17.580462 6113 consensus_queue.cc:237] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 192, Committed index: 192, Last appended: 1.195, Last appended by leader: 195, Current term: 2, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:17.580633 5905 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "b41910bc980f4cfdbcc6eb23e1084325" candidate_term: 2 candidate_status { last_received { term: 1 index: 195 } } ignore_live_leader: true dest_uuid: "f79e0a34bf4d4181aed51bd155010a25"
I20250902 21:32:17.580720 5905 raft_consensus.cc:2391] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 2 FOLLOWER]: Leader election vote request: Denying vote to candidate b41910bc980f4cfdbcc6eb23e1084325 in current term 2: Already voted for candidate f79e0a34bf4d4181aed51bd155010a25 in this term.
I20250902 21:32:17.580786 5773 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "f79e0a34bf4d4181aed51bd155010a25" candidate_term: 2 candidate_status { last_received { term: 1 index: 192 } } ignore_live_leader: true dest_uuid: "b41910bc980f4cfdbcc6eb23e1084325"
I20250902 21:32:17.580848 5773 raft_consensus.cc:2391] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 2 LEADER]: Leader election vote request: Denying vote to candidate f79e0a34bf4d4181aed51bd155010a25 in current term 2: Already voted for candidate b41910bc980f4cfdbcc6eb23e1084325 in this term.
I20250902 21:32:17.581120 5838 leader_election.cc:304] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [CANDIDATE]: Term 2 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: f79e0a34bf4d4181aed51bd155010a25; no voters: 3a35f7f28cb9438dbcfb3196e167fdc5, b41910bc980f4cfdbcc6eb23e1084325
I20250902 21:32:17.581173 5885 tablet_service.cc:1423] Tablet server f79e0a34bf4d4181aed51bd155010a25 set to quiescing
I20250902 21:32:17.581204 5885 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250902 21:32:17.581125 5495 catalog_manager.cc:5582] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 reported cstate change: term changed from 1 to 2, leader changed from 3a35f7f28cb9438dbcfb3196e167fdc5 (127.4.52.196) to b41910bc980f4cfdbcc6eb23e1084325 (127.4.52.194). New cstate: current_term: 2 leader_uuid: "b41910bc980f4cfdbcc6eb23e1084325" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } health_report { overall_health: HEALTHY } } }
I20250902 21:32:17.581279 6112 raft_consensus.cc:2747] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 2 FOLLOWER]: Leader election lost for term 2. Reason: could not achieve majority
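The term-2 election above follows the standard Raft vote-granting rule: a voter denies a candidate whose last-logged OpId is behind its own (1.192 vs 1.195), denies a second candidate after it has already voted in that term, and the candidate that reaches a majority becomes leader. Below is a compact standalone sketch of that rule, not Kudu's raft_consensus.cc; the struct names and truncated UUIDs are illustrative.

#include <cstdint>
#include <iostream>
#include <map>
#include <optional>
#include <string>

// Highly simplified vote-granting rule from the exchange above.
struct OpId {
  int64_t term;
  int64_t index;
  bool operator<(const OpId& o) const {
    return term != o.term ? term < o.term : index < o.index;
  }
};

struct Voter {
  OpId last_logged;
  std::map<int64_t, std::string> voted_for_by_term;  // term -> candidate uuid

  bool RequestVote(int64_t term, const std::string& candidate,
                   const OpId& candidate_last_logged) {
    if (candidate_last_logged < last_logged) {
      std::cout << "deny " << candidate << ": candidate log is behind\n";
      return false;
    }
    auto [it, inserted] = voted_for_by_term.try_emplace(term, candidate);
    if (!inserted && it->second != candidate) {
      std::cout << "deny " << candidate << ": already voted for "
                << it->second << " in term " << term << "\n";
      return false;
    }
    std::cout << "grant vote to " << candidate << " in term " << term << "\n";
    return true;
  }
};

int main() {
  // Mirrors the election above: the candidate at OpId 1.195 wins, while the
  // one stuck at 1.192 cannot gather a majority.
  Voter a{{1, 195}, {}};             // like 3a35f7..., has not voted yet
  Voter b{{1, 195}, {{2, "b419"}}};  // like b41910..., voted for itself
  Voter f{{1, 192}, {{2, "f79e"}}};  // like f79e0a..., voted for itself

  int votes_for_b = 1 /* self */ + a.RequestVote(2, "b419", {1, 195})
                                 + f.RequestVote(2, "b419", {1, 195});
  int votes_for_f = 1 /* self */ + a.RequestVote(2, "f79e", {1, 192})
                                 + b.RequestVote(2, "f79e", {1, 192});
  std::cout << "b419: " << votes_for_b << "/3, f79e: " << votes_for_f << "/3\n";
  return 0;
}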
I20250902 21:32:17.624557 5687 heartbeater.cc:499] Master 127.4.52.254:39913 was elected leader, sending a full tablet report...
I20250902 21:32:17.672271 5905 raft_consensus.cc:1273] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 2 FOLLOWER]: Refusing update from remote peer b41910bc980f4cfdbcc6eb23e1084325: Log matching property violated. Preceding OpId in replica: term: 1 index: 192. Preceding OpId from leader: term: 2 index: 196. (index mismatch)
I20250902 21:32:17.672605 6236 consensus_queue.cc:1035] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [LEADER]: Connected to new peer: Peer: permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 196, Last known committed idx: 192, Time since last communication: 0.000s
I20250902 21:32:17.674202 6037 raft_consensus.cc:1273] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 2 FOLLOWER]: Refusing update from remote peer b41910bc980f4cfdbcc6eb23e1084325: Log matching property violated. Preceding OpId in replica: term: 1 index: 195. Preceding OpId from leader: term: 2 index: 196. (index mismatch)
I20250902 21:32:17.674432 6236 consensus_queue.cc:1035] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [LEADER]: Connected to new peer: Peer: permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 196, Last known committed idx: 195, Time since last communication: 0.000s
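The "Refusing update ... Log matching property violated" / LMP_MISMATCH exchanges above are the new leader discovering where each follower's log diverges: a follower only accepts a batch whose preceding OpId it already holds, and the leader walks its next-index back until the logs line up. A minimal sketch of just that check follows; it is not Kudu's consensus queue, and the numbers are taken loosely from the lines above.

#include <cstdint>
#include <iostream>
#include <vector>

// Simplified log-matching check: the follower accepts an update only if it
// already holds the entry preceding the new batch.
struct OpId { int64_t term; int64_t index; };

struct FollowerLog {
  std::vector<OpId> entries;  // entries[i] has index i + 1

  bool Accept(const OpId& preceding) {
    if (preceding.index == 0) return true;  // leader starts from the beginning
    if (preceding.index > static_cast<int64_t>(entries.size())) return false;
    return entries[preceding.index - 1].term == preceding.term;
  }
};

int main() {
  FollowerLog follower;
  for (int64_t i = 1; i <= 192; ++i) follower.entries.push_back({1, i});

  // Leader probes backwards until the follower accepts, mirroring
  // "Next index: 196" falling back toward the follower's last opid 1.192.
  int64_t next_index = 196;
  while (!follower.Accept({1, next_index - 1})) {
    std::cout << "LMP mismatch at preceding index " << next_index - 1 << "\n";
    --next_index;
  }
  std::cout << "logs match; leader resumes replication from index "
            << next_index << "\n";
  return 0;
}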
W20250902 21:32:17.675977 5995 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48944: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:17.676010 5997 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48944: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:17.676091 5996 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48944: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:17.677707 5863 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59178: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:17.678704 5863 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59178: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:17.679783 5863 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59178: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
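The "failed op ... is not leader of this config" warnings above come from the test workload writing to replicas that just stepped down; the client keeps retrying until it reaches the new leader. The sketch below shows that retry idea only, as a toy stand-in for the real client's metadata cache; none of it is Kudu client code and the Replica type is hypothetical.

#include <functional>
#include <iostream>
#include <string>
#include <vector>

// Illustrative retry loop: a write bounces off followers with a
// "not leader of this config" style error until a leader accepts it.
struct Replica {
  std::string uuid;
  bool is_leader;
};

bool WriteWithRetry(const std::vector<Replica>& replicas,
                    const std::function<void(const Replica&)>& do_write) {
  for (const Replica& r : replicas) {
    if (!r.is_leader) {
      std::cout << "failed op: replica " << r.uuid
                << " is not leader of this config\n";
      continue;  // try the next replica
    }
    do_write(r);
    return true;
  }
  return false;  // no leader reachable right now; caller backs off and retries
}

int main() {
  std::vector<Replica> replicas = {
      {"3a35f7f2...", false}, {"f79e0a34...", false}, {"b41910bc...", true}};
  WriteWithRetry(replicas, [](const Replica& r) {
    std::cout << "write accepted by leader " << r.uuid << "\n";
  });
  return 0;
}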
I20250902 21:32:18.754855 6017 tablet_service.cc:1423] Tablet server 3a35f7f28cb9438dbcfb3196e167fdc5 set to quiescing
I20250902 21:32:18.754917 6017 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250902 21:32:18.813241 4307 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6wlXYv/build/release/bin/kudu with pid 5691
W20250902 21:32:18.819860 5839 connection.cc:537] server connection from 127.4.52.194:46155 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20250902 21:32:18.819994 5972 connection.cc:537] server connection from 127.4.52.194:34801 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20250902 21:32:18.820276 6092 connection.cc:537] client connection to 127.4.52.194:33503 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20250902 21:32:18.820358 6092 meta_cache.cc:302] tablet 1bd11ec67527495b831ab711f8a2f39b: replica b41910bc980f4cfdbcc6eb23e1084325 (127.4.52.194:33503) has failed: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
I20250902 21:32:18.820591 4307 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6wlXYv/build/release/bin/kudu
/tmp/dist-test-task6wlXYv/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.4.52.194:33503
--local_ip_for_outbound_sockets=127.4.52.194
--tserver_master_addrs=127.4.52.254:39913
--webserver_port=37969
--webserver_interface=127.4.52.194
--builtin_ntp_servers=127.4.52.212:42387
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
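The "Running <binary> <flags> ... with env {}" block above is the exact argv the mini-cluster uses to restart ts-1 with its original flags and bind address. As a generic illustration of launching a binary with such a flag vector (a plain fork/exec sketch, not ExternalMiniCluster's API; the binary path and flags below are placeholders):

#include <sys/wait.h>
#include <unistd.h>

#include <cstdlib>
#include <iostream>
#include <string>
#include <vector>

// Build an argv vector, fork, and exec, in the spirit of the command dump
// above. Placeholder binary and flags; error handling kept minimal.
pid_t LaunchProcess(const std::string& binary,
                    const std::vector<std::string>& flags) {
  std::vector<char*> argv;
  argv.push_back(const_cast<char*>(binary.c_str()));
  for (const std::string& f : flags) {
    argv.push_back(const_cast<char*>(f.c_str()));
  }
  argv.push_back(nullptr);

  pid_t pid = fork();
  if (pid == 0) {
    execv(binary.c_str(), argv.data());
    _exit(127);  // only reached if exec fails
  }
  return pid;
}

int main() {
  pid_t pid = LaunchProcess("/bin/echo", {"tserver", "run", "--logtostderr"});
  int status = 0;
  waitpid(pid, &status, 0);
  std::cout << "child exited with status " << WEXITSTATUS(status) << "\n";
  return 0;
}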
W20250902 21:32:18.822599 5995 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48944: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:18.823640 5995 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48944: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:18.827126 5864 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59178: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:18.827126 5863 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59178: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:18.838446 5996 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48944: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:18.838472 5995 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48944: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:18.845120 5864 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59178: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:18.846114 5864 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59178: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:18.861835 5995 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48944: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:18.863780 5995 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48944: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:18.874511 5864 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59178: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:18.874511 5863 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59178: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:18.894399 6252 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250902 21:32:18.894560 6252 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250902 21:32:18.894588 6252 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250902 21:32:18.895869 6252 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250902 21:32:18.895924 6252 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.4.52.194
I20250902 21:32:18.897185 6252 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.4.52.212:42387
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.4.52.194:33503
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.4.52.194
--webserver_port=37969
--tserver_master_addrs=127.4.52.254:39913
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.6252
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.4.52.194
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 6ab3f249351a4ee5d73eb8bda3585c4dcd85d070
build type RELEASE
built by None at 02 Sep 2025 21:15:19 UTC on 5fd53c4cbb9d
build id 7867
I20250902 21:32:18.897384 6252 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250902 21:32:18.897585 6252 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250902 21:32:18.899073 5996 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48944: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:18.899104 5995 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48944: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
I20250902 21:32:18.900295 6252 server_base.cc:1047] running on GCE node
W20250902 21:32:18.900363 6261 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:18.900279 6259 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:18.900377 6258 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:18.900676 6252 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250902 21:32:18.900830 6252 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250902 21:32:18.901952 6252 hybrid_clock.cc:648] HybridClock initialized: now 1756848738901930 us; error 36 us; skew 500 ppm
I20250902 21:32:18.903046 6252 webserver.cc:480] Webserver started at http://127.4.52.194:37969/ using document root <none> and password file <none>
I20250902 21:32:18.903229 6252 fs_manager.cc:362] Metadata directory not provided
I20250902 21:32:18.903281 6252 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250902 21:32:18.904330 6252 fs_manager.cc:714] Time spent opening directory manager: real 0.001s user 0.000s sys 0.001s
I20250902 21:32:18.904878 6267 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:18.905038 6252 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:18.905099 6252 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/data,/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/wal
uuid: "b41910bc980f4cfdbcc6eb23e1084325"
format_stamp: "Formatted at 2025-09-02 21:32:16 on dist-test-slave-jkp9"
I20250902 21:32:18.905364 6252 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
W20250902 21:32:18.910984 5864 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59178: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:18.915009 5864 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59178: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
I20250902 21:32:18.917598 6252 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250902 21:32:18.917801 6252 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250902 21:32:18.917989 6252 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250902 21:32:18.918228 6252 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250902 21:32:18.918624 6274 ts_tablet_manager.cc:536] Loading tablet metadata (0/1 complete)
I20250902 21:32:18.919432 6252 ts_tablet_manager.cc:579] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20250902 21:32:18.919471 6252 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.001s user 0.000s sys 0.000s
I20250902 21:32:18.919505 6252 ts_tablet_manager.cc:594] Registering tablets (0/1 complete)
I20250902 21:32:18.920004 6252 ts_tablet_manager.cc:610] Registered 1 tablets
I20250902 21:32:18.920073 6252 ts_tablet_manager.cc:589] Time spent register tablets: real 0.001s user 0.000s sys 0.000s
I20250902 21:32:18.920104 6274 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Bootstrap starting.
I20250902 21:32:18.926671 6252 rpc_server.cc:307] RPC server started. Bound to: 127.4.52.194:33503
I20250902 21:32:18.927032 6252 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/data/info.pb
I20250902 21:32:18.927157 6381 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.4.52.194:33503 every 8 connection(s)
I20250902 21:32:18.932696 6382 heartbeater.cc:344] Connected to a master server at 127.4.52.254:39913
I20250902 21:32:18.932770 6382 heartbeater.cc:461] Registering TS with master...
I20250902 21:32:18.932907 6382 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
I20250902 21:32:18.933379 5495 ts_manager.cc:194] Re-registered known tserver with Master: b41910bc980f4cfdbcc6eb23e1084325 (127.4.52.194:33503)
I20250902 21:32:18.933900 5495 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.4.52.194:35937
I20250902 21:32:18.935075 4307 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6wlXYv/build/release/bin/kudu as pid 6252
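This completes the first cycle of the "Restarting batch of 4 tservers" announced earlier: kill the process, start it again with the same flags, and wait for it to re-register and heartbeat to the master. A toy outline of that cycle is below; the TabletServerHandle interface is hypothetical and only names the steps the log records, it is not the test harness's API.

#include <iostream>
#include <string>
#include <vector>

// Toy outline of the batch restart: each tablet server is stopped, started
// again with its original flags, and the test waits until it has
// re-registered with the master before moving on.
struct TabletServerHandle {
  std::string uuid;
  bool running = true;

  void Shutdown() { running = false; std::cout << "killed " << uuid << "\n"; }
  void Restart() { running = true; std::cout << "restarted " << uuid << "\n"; }
  bool ReRegisteredWithMaster() const { return running; }
};

void RollingRestart(std::vector<TabletServerHandle>& batch) {
  for (TabletServerHandle& ts : batch) {
    ts.Shutdown();
    ts.Restart();
    while (!ts.ReRegisteredWithMaster()) {
      // The real test waits on heartbeat/registration with a timeout here.
    }
    std::cout << ts.uuid << " re-registered with master\n";
  }
}

int main() {
  std::vector<TabletServerHandle> batch = {
      {"b41910bc..."}, {"6311ff5f..."}, {"3a35f7f2..."}, {"f79e0a34..."}};
  RollingRestart(batch);
  return 0;
}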
I20250902 21:32:18.935168 4307 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6wlXYv/build/release/bin/kudu with pid 5559
I20250902 21:32:18.939996 4307 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6wlXYv/build/release/bin/kudu
/tmp/dist-test-task6wlXYv/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.4.52.193:34353
--local_ip_for_outbound_sockets=127.4.52.193
--tserver_master_addrs=127.4.52.254:39913
--webserver_port=43863
--webserver_interface=127.4.52.193
--builtin_ntp_servers=127.4.52.212:42387
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20250902 21:32:18.949011 5995 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48944: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
I20250902 21:32:18.959667 6274 log.cc:826] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Log is configured to *not* fsync() on all Append() calls
W20250902 21:32:18.962168 5995 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48944: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:18.963874 5864 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59178: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:18.979081 5864 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59178: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:19.004091 5995 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48944: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:19.014190 5995 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48944: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:19.016559 6387 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250902 21:32:19.016709 6387 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250902 21:32:19.016733 6387 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250902 21:32:19.018510 6387 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250902 21:32:19.018565 6387 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.4.52.193
I20250902 21:32:19.019959 6387 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.4.52.212:42387
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.4.52.193:34353
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.4.52.193
--webserver_port=43863
--tserver_master_addrs=127.4.52.254:39913
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.6387
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.4.52.193
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 6ab3f249351a4ee5d73eb8bda3585c4dcd85d070
build type RELEASE
built by None at 02 Sep 2025 21:15:19 UTC on 5fd53c4cbb9d
build id 7867
I20250902 21:32:19.020175 6387 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250902 21:32:19.020390 6387 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250902 21:32:19.022969 6394 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:19.023027 6387 server_base.cc:1047] running on GCE node
W20250902 21:32:19.023052 6393 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:19.022951 6396 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:19.023316 6387 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250902 21:32:19.023522 6387 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
W20250902 21:32:19.023890 5864 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59178: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
I20250902 21:32:19.030804 6387 hybrid_clock.cc:648] HybridClock initialized: now 1756848739030772 us; error 48 us; skew 500 ppm
I20250902 21:32:19.032285 6387 webserver.cc:480] Webserver started at http://127.4.52.193:43863/ using document root <none> and password file <none>
I20250902 21:32:19.032532 6387 fs_manager.cc:362] Metadata directory not provided
I20250902 21:32:19.032620 6387 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250902 21:32:19.034047 6387 fs_manager.cc:714] Time spent opening directory manager: real 0.001s user 0.001s sys 0.000s
I20250902 21:32:19.034685 6402 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:19.034857 6387 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:19.034920 6387 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/data,/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/wal
uuid: "6311ff5fc63a49108dc3000117399229"
format_stamp: "Formatted at 2025-09-02 21:32:16 on dist-test-slave-jkp9"
I20250902 21:32:19.035171 6387 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
W20250902 21:32:19.036065 5864 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59178: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
I20250902 21:32:19.043663 6387 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250902 21:32:19.043852 6387 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250902 21:32:19.043946 6387 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250902 21:32:19.044101 6387 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250902 21:32:19.044389 6387 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250902 21:32:19.044427 6387 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:19.044457 6387 ts_tablet_manager.cc:610] Registered 0 tablets
I20250902 21:32:19.044473 6387 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:19.050953 6387 rpc_server.cc:307] RPC server started. Bound to: 127.4.52.193:34353
I20250902 21:32:19.051090 6515 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.4.52.193:34353 every 8 connection(s)
I20250902 21:32:19.051345 6387 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/data/info.pb
I20250902 21:32:19.054245 4307 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6wlXYv/build/release/bin/kudu as pid 6387
I20250902 21:32:19.054344 4307 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6wlXYv/build/release/bin/kudu with pid 5954
I20250902 21:32:19.062326 6516 heartbeater.cc:344] Connected to a master server at 127.4.52.254:39913
I20250902 21:32:19.062501 6516 heartbeater.cc:461] Registering TS with master...
I20250902 21:32:19.062849 6516 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
I20250902 21:32:19.063093 4307 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6wlXYv/build/release/bin/kudu
/tmp/dist-test-task6wlXYv/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.4.52.196:35541
--local_ip_for_outbound_sockets=127.4.52.196
--tserver_master_addrs=127.4.52.254:39913
--webserver_port=36885
--webserver_interface=127.4.52.196
--builtin_ntp_servers=127.4.52.212:42387
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20250902 21:32:19.063282 5495 ts_manager.cc:194] Re-registered known tserver with Master: 6311ff5fc63a49108dc3000117399229 (127.4.52.193:34353)
I20250902 21:32:19.063791 5495 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.4.52.193:46459
W20250902 21:32:19.068262 5864 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59178: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:19.084061 5864 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59178: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:19.114077 5864 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59178: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:19.114286 6520 raft_consensus.cc:668] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: failed to trigger leader election: Illegal state: leader elections are disabled
W20250902 21:32:19.158419 6519 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250902 21:32:19.158577 6519 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250902 21:32:19.158596 6519 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250902 21:32:19.159857 6519 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250902 21:32:19.159906 6519 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.4.52.196
I20250902 21:32:19.161245 6519 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.4.52.212:42387
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.4.52.196:35541
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.4.52.196
--webserver_port=36885
--tserver_master_addrs=127.4.52.254:39913
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.6519
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.4.52.196
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 6ab3f249351a4ee5d73eb8bda3585c4dcd85d070
build type RELEASE
built by None at 02 Sep 2025 21:15:19 UTC on 5fd53c4cbb9d
build id 7867
I20250902 21:32:19.161438 6519 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250902 21:32:19.161654 6519 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250902 21:32:19.161906 5864 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59178: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:19.164381 6525 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:19.164369 6528 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:19.164369 6526 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:19.164541 6519 server_base.cc:1047] running on GCE node
I20250902 21:32:19.164753 6519 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250902 21:32:19.164929 6519 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250902 21:32:19.166065 6519 hybrid_clock.cc:648] HybridClock initialized: now 1756848739166052 us; error 27 us; skew 500 ppm
I20250902 21:32:19.167110 6519 webserver.cc:480] Webserver started at http://127.4.52.196:36885/ using document root <none> and password file <none>
I20250902 21:32:19.167276 6519 fs_manager.cc:362] Metadata directory not provided
I20250902 21:32:19.167331 6519 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250902 21:32:19.168371 6519 fs_manager.cc:714] Time spent opening directory manager: real 0.000s user 0.001s sys 0.000s
I20250902 21:32:19.168946 6534 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:19.169138 6519 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.001s sys 0.000s
I20250902 21:32:19.169215 6519 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/data,/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/wal
uuid: "3a35f7f28cb9438dbcfb3196e167fdc5"
format_stamp: "Formatted at 2025-09-02 21:32:16 on dist-test-slave-jkp9"
I20250902 21:32:19.169512 6519 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250902 21:32:19.183789 6519 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250902 21:32:19.184048 6519 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250902 21:32:19.184157 6519 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250902 21:32:19.184365 6519 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250902 21:32:19.184796 6541 ts_tablet_manager.cc:536] Loading tablet metadata (0/1 complete)
I20250902 21:32:19.185568 6519 ts_tablet_manager.cc:579] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20250902 21:32:19.185622 6519 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.001s user 0.000s sys 0.000s
I20250902 21:32:19.185655 6519 ts_tablet_manager.cc:594] Registering tablets (0/1 complete)
I20250902 21:32:19.186336 6519 ts_tablet_manager.cc:610] Registered 1 tablets
I20250902 21:32:19.186372 6519 ts_tablet_manager.cc:589] Time spent register tablets: real 0.001s user 0.000s sys 0.000s
I20250902 21:32:19.186408 6541 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Bootstrap starting.
W20250902 21:32:19.191208 5864 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59178: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
I20250902 21:32:19.193095 6519 rpc_server.cc:307] RPC server started. Bound to: 127.4.52.196:35541
I20250902 21:32:19.193228 6648 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.4.52.196:35541 every 8 connection(s)
I20250902 21:32:19.193451 6519 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/data/info.pb
I20250902 21:32:19.198454 4307 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6wlXYv/build/release/bin/kudu as pid 6519
I20250902 21:32:19.198549 4307 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6wlXYv/build/release/bin/kudu with pid 5822
I20250902 21:32:19.204047 6649 heartbeater.cc:344] Connected to a master server at 127.4.52.254:39913
I20250902 21:32:19.204147 6649 heartbeater.cc:461] Registering TS with master...
I20250902 21:32:19.204352 6649 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
I20250902 21:32:19.204902 5495 ts_manager.cc:194] Re-registered known tserver with Master: 3a35f7f28cb9438dbcfb3196e167fdc5 (127.4.52.196:35541)
I20250902 21:32:19.205423 5495 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.4.52.196:52927
I20250902 21:32:19.208179 4307 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6wlXYv/build/release/bin/kudu
/tmp/dist-test-task6wlXYv/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.4.52.195:46587
--local_ip_for_outbound_sockets=127.4.52.195
--tserver_master_addrs=127.4.52.254:39913
--webserver_port=32789
--webserver_interface=127.4.52.195
--builtin_ntp_servers=127.4.52.212:42387
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20250902 21:32:19.240490 6541 log.cc:826] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Log is configured to *not* fsync() on all Append() calls
I20250902 21:32:19.241199 6274 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Bootstrap replayed 1/1 log segments. Stats: ops{read=1625 overwritten=0 applied=1623 ignored=0} inserts{seen=81050 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20250902 21:32:19.241537 6274 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Bootstrap complete.
I20250902 21:32:19.242834 6274 ts_tablet_manager.cc:1397] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Time spent bootstrapping tablet: real 0.323s user 0.241s sys 0.075s
I20250902 21:32:19.243817 6274 raft_consensus.cc:357] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 2 FOLLOWER]: Replica starting. Triggering 2 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:19.244414 6274 raft_consensus.cc:738] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 2 FOLLOWER]: Becoming Follower/Learner. State: Replica: b41910bc980f4cfdbcc6eb23e1084325, State: Initialized, Role: FOLLOWER
I20250902 21:32:19.244554 6274 consensus_queue.cc:260] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 1623, Last appended: 2.1625, Last appended by leader: 1625, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:19.244771 6274 ts_tablet_manager.cc:1428] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Time spent starting tablet: real 0.002s user 0.001s sys 0.000s
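The bootstrap line above reports 1625 ops read from the WAL, 1623 applied, and 2 left pending. A minimal sketch of that bookkeeping, assuming replayed entries are applied only up to the last known committed index and the rest are re-handed to consensus as pending replicates (illustrative, not tablet_bootstrap.cc):

// Hypothetical sketch of the counts behind "Bootstrap replayed ... Pending: 2 replicates".
#include <cstdint>
#include <cstdio>
#include <vector>

struct LogEntry { int64_t term; int64_t index; };

int main() {
  std::vector<LogEntry> replayed;
  for (int64_t i = 1; i <= 1625; ++i) replayed.push_back({2, i});  // Entries recovered from the WAL.
  int64_t committed_index = 1623;  // Highest index known committed at replay time.

  int64_t applied = 0, pending = 0;
  for (const auto& e : replayed) {
    if (e.index <= committed_index) ++applied;  // Safe to apply to the tablet.
    else ++pending;                             // Kept as pending replicates for consensus.
  }
  std::printf("read=%zu applied=%lld pending=%lld\n",
              replayed.size(), (long long)applied, (long long)pending);  // 1625 / 1623 / 2
  return 0;
}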
I20250902 21:32:19.244907 6382 heartbeater.cc:499] Master 127.4.52.254:39913 was elected leader, sending a full tablet report...
W20250902 21:32:19.251403 6295 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:58386: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:19.251403 6296 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:58386: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
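These warnings come from writes landing on a replica that is currently a follower; the caller keeps retrying until a leader accepts the op. A minimal client-side sketch of that retry loop, assuming a refresh step that re-resolves the leader between attempts (illustrative, not the Kudu client):

// Hypothetical sketch: retry a write that was rejected with "not leader",
// refreshing the leader view and backing off between attempts.
#include <chrono>
#include <functional>
#include <thread>

enum class WriteResult { kOk, kNotLeader };

bool WriteWithRetry(const std::function<WriteResult()>& send_to_current_leader,
                    const std::function<void()>& refresh_leader,
                    int max_attempts = 10) {
  auto backoff = std::chrono::milliseconds(10);
  for (int attempt = 0; attempt < max_attempts; ++attempt) {
    if (send_to_current_leader() == WriteResult::kOk) return true;
    refresh_leader();                      // Ask the master / meta cache who leads now.
    std::this_thread::sleep_for(backoff);  // Back off before the next attempt.
    backoff *= 2;
  }
  return false;
}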
W20250902 21:32:19.305924 6652 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250902 21:32:19.306069 6652 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250902 21:32:19.306099 6652 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250902 21:32:19.307369 6652 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250902 21:32:19.307430 6652 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.4.52.195
I20250902 21:32:19.308810 6652 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.4.52.212:42387
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.4.52.195:46587
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.4.52.195
--webserver_port=32789
--tserver_master_addrs=127.4.52.254:39913
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.6652
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.4.52.195
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 6ab3f249351a4ee5d73eb8bda3585c4dcd85d070
build type RELEASE
built by None at 02 Sep 2025 21:15:19 UTC on 5fd53c4cbb9d
build id 7867
I20250902 21:32:19.309007 6652 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250902 21:32:19.309226 6652 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250902 21:32:19.312036 6664 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:19.312041 6661 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:19.312192 6652 server_base.cc:1047] running on GCE node
W20250902 21:32:19.312336 6662 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:19.312530 6652 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250902 21:32:19.312733 6652 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250902 21:32:19.313864 6652 hybrid_clock.cc:648] HybridClock initialized: now 1756848739313848 us; error 33 us; skew 500 ppm
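The hybrid clock line reports the physical component ("now ... us") plus its error bound. A hybrid timestamp is commonly packed as physical microseconds in the high bits and a logical counter in the low bits; the sketch below assumes 12 logical bits, which is an assumption of the sketch rather than a statement about HybridClock's internals:

// Hypothetical sketch of a hybrid timestamp layout: physical microseconds
// shifted up, logical counter in the low bits. Illustrative only.
#include <cstdint>
#include <cstdio>

constexpr int kLogicalBits = 12;  // Assumption for this sketch.

uint64_t MakeHybridTimestamp(uint64_t physical_micros, uint64_t logical) {
  return (physical_micros << kLogicalBits) | (logical & ((1ULL << kLogicalBits) - 1));
}

int main() {
  uint64_t now_us = 1756848739313848ULL;  // Physical reading from the log line above.
  uint64_t ts = MakeHybridTimestamp(now_us, 0);
  std::printf("physical=%llu logical=%llu\n",
              (unsigned long long)(ts >> kLogicalBits),
              (unsigned long long)(ts & ((1ULL << kLogicalBits) - 1)));
  return 0;
}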
I20250902 21:32:19.314973 6652 webserver.cc:480] Webserver started at http://127.4.52.195:32789/ using document root <none> and password file <none>
I20250902 21:32:19.315163 6652 fs_manager.cc:362] Metadata directory not provided
I20250902 21:32:19.315212 6652 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250902 21:32:19.316402 6652 fs_manager.cc:714] Time spent opening directory manager: real 0.001s user 0.000s sys 0.001s
I20250902 21:32:19.317003 6670 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:19.317173 6652 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.000s sys 0.001s
I20250902 21:32:19.317235 6652 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/data,/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/wal
uuid: "f79e0a34bf4d4181aed51bd155010a25"
format_stamp: "Formatted at 2025-09-02 21:32:16 on dist-test-slave-jkp9"
I20250902 21:32:19.317477 6652 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250902 21:32:19.328349 6652 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250902 21:32:19.328555 6652 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250902 21:32:19.328650 6652 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250902 21:32:19.328835 6652 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250902 21:32:19.329213 6677 ts_tablet_manager.cc:536] Loading tablet metadata (0/1 complete)
I20250902 21:32:19.329917 6652 ts_tablet_manager.cc:579] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20250902 21:32:19.329954 6652 ts_tablet_manager.cc:525] Time spent loading tablet metadata: real 0.001s user 0.000s sys 0.000s
I20250902 21:32:19.329975 6652 ts_tablet_manager.cc:594] Registering tablets (0/1 complete)
I20250902 21:32:19.330431 6652 ts_tablet_manager.cc:610] Registered 1 tablets
I20250902 21:32:19.330485 6652 ts_tablet_manager.cc:589] Time spent registering tablets: real 0.001s user 0.000s sys 0.000s
I20250902 21:32:19.330560 6677 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Bootstrap starting.
I20250902 21:32:19.337260 6652 rpc_server.cc:307] RPC server started. Bound to: 127.4.52.195:46587
I20250902 21:32:19.337657 6652 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/data/info.pb
I20250902 21:32:19.337918 6784 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.4.52.195:46587 every 8 connection(s)
I20250902 21:32:19.343695 4307 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6wlXYv/build/release/bin/kudu as pid 6652
I20250902 21:32:19.347126 6785 heartbeater.cc:344] Connected to a master server at 127.4.52.254:39913
I20250902 21:32:19.347219 6785 heartbeater.cc:461] Registering TS with master...
I20250902 21:32:19.347432 6785 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
W20250902 21:32:19.347584 6296 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:58386: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
I20250902 21:32:19.348058 5495 ts_manager.cc:194] Re-registered known tserver with Master: f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587)
I20250902 21:32:19.348541 5495 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.4.52.195:58173
I20250902 21:32:19.382128 6677 log.cc:826] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Log is configured to *not* fsync() on all Append() calls
W20250902 21:32:19.389549 6296 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:58386: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:19.450529 6296 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:58386: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
I20250902 21:32:19.491183 6656 raft_consensus.cc:491] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 2 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20250902 21:32:19.491302 6656 raft_consensus.cc:513] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 2 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:19.491626 6656 leader_election.cc:290] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [CANDIDATE]: Term 3 pre-election: Requested pre-vote from peers 3a35f7f28cb9438dbcfb3196e167fdc5 (127.4.52.196:35541), f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587)
W20250902 21:32:19.492940 6296 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:58386: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
I20250902 21:32:19.499480 6585 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "b41910bc980f4cfdbcc6eb23e1084325" candidate_term: 3 candidate_status { last_received { term: 2 index: 1625 } } ignore_live_leader: false dest_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" is_pre_election: true
W20250902 21:32:19.500408 6270 leader_election.cc:343] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [CANDIDATE]: Term 3 pre-election: Tablet error from VoteRequest() call to peer 3a35f7f28cb9438dbcfb3196e167fdc5 (127.4.52.196:35541): Illegal state: must be running to vote when last-logged opid is not known
I20250902 21:32:19.500761 6739 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "b41910bc980f4cfdbcc6eb23e1084325" candidate_term: 3 candidate_status { last_received { term: 2 index: 1625 } } ignore_live_leader: false dest_uuid: "f79e0a34bf4d4181aed51bd155010a25" is_pre_election: true
W20250902 21:32:19.501623 6268 leader_election.cc:343] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [CANDIDATE]: Term 3 pre-election: Tablet error from VoteRequest() call to peer f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587): Illegal state: must be running to vote when last-logged opid is not known
I20250902 21:32:19.501680 6268 leader_election.cc:304] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [CANDIDATE]: Term 3 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: b41910bc980f4cfdbcc6eb23e1084325; no voters: 3a35f7f28cb9438dbcfb3196e167fdc5, f79e0a34bf4d4181aed51bd155010a25
I20250902 21:32:19.501808 6656 raft_consensus.cc:2747] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 2 FOLLOWER]: Leader pre-election lost for term 3. Reason: could not achieve majority
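The pre-election above fails because both peers are still bootstrapping ("must be running to vote"), so the candidate ends up with only its own implicit yes vote: 1 of 3 is short of a majority. A minimal tally sketch under that assumption (not raft_consensus.cc):

// Hypothetical sketch of the tally behind "1 yes votes; 2 no votes ... candidate lost".
#include <cstdio>

bool DecideElection(int num_voters, int yes_votes) {
  int majority = num_voters / 2 + 1;  // 2 of 3 here.
  return yes_votes >= majority;
}

int main() {
  int yes = 1;  // Candidate's own vote; both peers could not vote yet.
  std::printf("won=%d\n", DecideElection(3, yes));  // won=0: pre-election lost.
  return 0;
}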
I20250902 21:32:19.502084 6541 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Bootstrap replayed 1/1 log segments. Stats: ops{read=1625 overwritten=0 applied=1623 ignored=0} inserts{seen=81050 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20250902 21:32:19.502591 6541 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Bootstrap complete.
I20250902 21:32:19.504210 6541 ts_tablet_manager.cc:1397] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Time spent bootstrapping tablet: real 0.318s user 0.266s sys 0.047s
I20250902 21:32:19.505157 6541 raft_consensus.cc:357] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 2 FOLLOWER]: Replica starting. Triggering 2 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:19.505911 6541 raft_consensus.cc:738] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 2 FOLLOWER]: Becoming Follower/Learner. State: Replica: 3a35f7f28cb9438dbcfb3196e167fdc5, State: Initialized, Role: FOLLOWER
I20250902 21:32:19.506042 6541 consensus_queue.cc:260] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 1623, Last appended: 2.1625, Last appended by leader: 1625, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:19.506229 6583 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250902 21:32:19.506275 6541 ts_tablet_manager.cc:1428] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Time spent starting tablet: real 0.002s user 0.003s sys 0.001s
I20250902 21:32:19.506381 6649 heartbeater.cc:499] Master 127.4.52.254:39913 was elected leader, sending a full tablet report...
I20250902 21:32:19.510532 6316 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250902 21:32:19.511631 6450 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250902 21:32:19.515950 6712 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
W20250902 21:32:19.524608 6545 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:49020: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:19.528188 6545 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:49020: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:19.564365 6296 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:58386: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:19.604076 6296 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:58386: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:19.605723 6545 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:49020: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
I20250902 21:32:19.660086 6677 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Bootstrap replayed 1/1 log segments. Stats: ops{read=1625 overwritten=0 applied=1624 ignored=0} inserts{seen=81100 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20250902 21:32:19.660437 6677 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Bootstrap complete.
I20250902 21:32:19.661549 6677 ts_tablet_manager.cc:1397] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Time spent bootstrapping tablet: real 0.331s user 0.276s sys 0.048s
I20250902 21:32:19.662371 6677 raft_consensus.cc:357] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 2 FOLLOWER]: Replica starting. Triggering 1 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:19.662969 6677 raft_consensus.cc:738] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 2 FOLLOWER]: Becoming Follower/Learner. State: Replica: f79e0a34bf4d4181aed51bd155010a25, State: Initialized, Role: FOLLOWER
I20250902 21:32:19.663084 6677 consensus_queue.cc:260] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 1624, Last appended: 2.1625, Last appended by leader: 1625, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:19.663306 6677 ts_tablet_manager.cc:1428] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Time spent starting tablet: real 0.002s user 0.004s sys 0.000s
I20250902 21:32:19.663349 6785 heartbeater.cc:499] Master 127.4.52.254:39913 was elected leader, sending a full tablet report...
W20250902 21:32:19.686796 6296 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:58386: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:19.688380 6545 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:49020: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:19.729547 6545 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:49020: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:19.729667 6296 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:58386: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
I20250902 21:32:19.737771 6825 raft_consensus.cc:491] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 2 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20250902 21:32:19.737901 6825 raft_consensus.cc:513] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 2 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:19.738155 6825 leader_election.cc:290] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [CANDIDATE]: Term 3 pre-election: Requested pre-vote from peers f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587), b41910bc980f4cfdbcc6eb23e1084325 (127.4.52.194:33503)
I20250902 21:32:19.741474 6739 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" candidate_term: 3 candidate_status { last_received { term: 2 index: 1625 } } ignore_live_leader: false dest_uuid: "f79e0a34bf4d4181aed51bd155010a25" is_pre_election: true
I20250902 21:32:19.741636 6739 raft_consensus.cc:2466] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 2 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 3a35f7f28cb9438dbcfb3196e167fdc5 in term 2.
I20250902 21:32:19.741588 6336 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" candidate_term: 3 candidate_status { last_received { term: 2 index: 1625 } } ignore_live_leader: false dest_uuid: "b41910bc980f4cfdbcc6eb23e1084325" is_pre_election: true
I20250902 21:32:19.741699 6336 raft_consensus.cc:2466] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 2 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 3a35f7f28cb9438dbcfb3196e167fdc5 in term 2.
I20250902 21:32:19.741854 6535 leader_election.cc:304] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [CANDIDATE]: Term 3 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 3a35f7f28cb9438dbcfb3196e167fdc5, f79e0a34bf4d4181aed51bd155010a25; no voters:
I20250902 21:32:19.741989 6825 raft_consensus.cc:2802] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 2 FOLLOWER]: Leader pre-election won for term 3
I20250902 21:32:19.742077 6825 raft_consensus.cc:491] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 2 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20250902 21:32:19.742108 6825 raft_consensus.cc:3058] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 2 FOLLOWER]: Advancing to term 3
I20250902 21:32:19.743026 6825 raft_consensus.cc:513] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 3 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:19.743129 6825 leader_election.cc:290] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [CANDIDATE]: Term 3 election: Requested vote from peers f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587), b41910bc980f4cfdbcc6eb23e1084325 (127.4.52.194:33503)
I20250902 21:32:19.743292 6739 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" candidate_term: 3 candidate_status { last_received { term: 2 index: 1625 } } ignore_live_leader: false dest_uuid: "f79e0a34bf4d4181aed51bd155010a25"
I20250902 21:32:19.743338 6336 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" candidate_term: 3 candidate_status { last_received { term: 2 index: 1625 } } ignore_live_leader: false dest_uuid: "b41910bc980f4cfdbcc6eb23e1084325"
I20250902 21:32:19.743399 6336 raft_consensus.cc:3058] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 2 FOLLOWER]: Advancing to term 3
I20250902 21:32:19.743383 6739 raft_consensus.cc:3058] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 2 FOLLOWER]: Advancing to term 3
I20250902 21:32:19.744362 6336 raft_consensus.cc:2466] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 3 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 3a35f7f28cb9438dbcfb3196e167fdc5 in term 3.
I20250902 21:32:19.744362 6739 raft_consensus.cc:2466] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 3 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 3a35f7f28cb9438dbcfb3196e167fdc5 in term 3.
I20250902 21:32:19.744540 6535 leader_election.cc:304] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [CANDIDATE]: Term 3 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 3a35f7f28cb9438dbcfb3196e167fdc5, f79e0a34bf4d4181aed51bd155010a25; no voters:
I20250902 21:32:19.744632 6825 raft_consensus.cc:2802] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 3 FOLLOWER]: Leader election won for term 3
I20250902 21:32:19.744724 6825 raft_consensus.cc:695] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 3 LEADER]: Becoming Leader. State: Replica: 3a35f7f28cb9438dbcfb3196e167fdc5, State: Running, Role: LEADER
I20250902 21:32:19.744800 6825 consensus_queue.cc:237] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 1623, Committed index: 1623, Last appended: 2.1625, Last appended by leader: 1625, Current term: 3, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:19.745395 5495 catalog_manager.cc:5582] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 reported cstate change: term changed from 2 to 3, leader changed from b41910bc980f4cfdbcc6eb23e1084325 (127.4.52.194) to 3a35f7f28cb9438dbcfb3196e167fdc5 (127.4.52.196). New cstate: current_term: 3 leader_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } health_report { overall_health: UNKNOWN } } }
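In the successful round above, both peers grant their votes: the candidate's term is current and its last-logged OpId (2.1625) is at least as up to date as theirs. A minimal sketch of that granting rule, assuming the standard Raft "candidate log is current" comparison (illustrative, not Kudu's vote-handling code):

// Hypothetical sketch of the vote-granting check implied by the log above.
#include <cstdint>

struct OpId { int64_t term; int64_t index; };

bool CandidateLogIsCurrent(const OpId& candidate_last, const OpId& voter_last) {
  if (candidate_last.term != voter_last.term) return candidate_last.term > voter_last.term;
  return candidate_last.index >= voter_last.index;
}

bool ShouldGrantVote(int64_t candidate_term, int64_t voter_term,
                     const OpId& candidate_last, const OpId& voter_last) {
  if (candidate_term < voter_term) return false;             // Stale candidate.
  return CandidateLogIsCurrent(candidate_last, voter_last);  // 2.1625 vs 2.1625 -> grant.
}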
W20250902 21:32:19.770469 6694 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59212: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:19.774147 6694 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59212: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
I20250902 21:32:19.815150 6336 raft_consensus.cc:1273] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 3 FOLLOWER]: Refusing update from remote peer 3a35f7f28cb9438dbcfb3196e167fdc5: Log matching property violated. Preceding OpId in replica: term: 2 index: 1625. Preceding OpId from leader: term: 3 index: 1627. (index mismatch)
I20250902 21:32:19.815150 6739 raft_consensus.cc:1273] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 3 FOLLOWER]: Refusing update from remote peer 3a35f7f28cb9438dbcfb3196e167fdc5: Log matching property violated. Preceding OpId in replica: term: 2 index: 1625. Preceding OpId from leader: term: 3 index: 1627. (index mismatch)
I20250902 21:32:19.815460 6830 consensus_queue.cc:1035] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [LEADER]: Connected to new peer: Peer: permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1626, Last known committed idx: 1624, Time since last communication: 0.000s
I20250902 21:32:19.815595 6825 consensus_queue.cc:1035] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [LEADER]: Connected to new peer: Peer: permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1626, Last known committed idx: 1623, Time since last communication: 0.000s
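The "Log matching property violated" refusals and the LMP_MISMATCH status show the usual Raft reconciliation: a follower accepts an append only if it already holds the entry the leader claims precedes the batch, and on a mismatch the leader walks that peer's next index back and resends. A minimal sketch of both sides (illustrative, not consensus_queue.cc):

// Hypothetical sketch of the log-matching check and the leader's reaction to LMP_MISMATCH.
#include <cstdint>

struct OpId { int64_t term; int64_t index; };

// Follower side: accept the batch only if the preceding entry matches exactly.
bool PrecedingEntryMatches(const OpId& follower_last, const OpId& leader_preceding) {
  return follower_last.term == leader_preceding.term &&
         follower_last.index == leader_preceding.index;
}

// Leader side: on a mismatch, step the peer's next index back and retry from there.
int64_t NextIndexAfterMismatch(int64_t current_next_index) {
  return current_next_index > 1 ? current_next_index - 1 : 1;
}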
I20250902 21:32:19.817817 6836 mvcc.cc:204] Tried to move back new op lower bound from 7196052438280454144 to 7196052437995204608. Current Snapshot: MvccSnapshot[applied={T|T < 7196052434175447040}]

W20250902 21:32:19.925580 6121 scanner-internal.cc:458] Time spent opening tablet: real 2.309s user 0.001s sys 0.000s
W20250902 21:32:19.925840 6122 scanner-internal.cc:458] Time spent opening tablet: real 2.309s user 0.001s sys 0.000s
W20250902 21:32:19.925849 6123 scanner-internal.cc:458] Time spent opening tablet: real 2.309s user 0.001s sys 0.000s
I20250902 21:32:20.064818 6516 heartbeater.cc:499] Master 127.4.52.254:39913 was elected leader, sending a full tablet report...
I20250902 21:32:24.781174 6316 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250902 21:32:24.783106 6450 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250902 21:32:24.789229 6583 tablet_service.cc:1430] Tablet server has 1 leaders and 0 scanners
I20250902 21:32:24.804246 6712 tablet_service.cc:1430] Tablet server has 0 leaders and 3 scanners
I20250902 21:32:25.180284 5497 ts_manager.cc:284] Unset tserver state for 3a35f7f28cb9438dbcfb3196e167fdc5 from MAINTENANCE_MODE
I20250902 21:32:25.183614 5495 ts_manager.cc:284] Unset tserver state for b41910bc980f4cfdbcc6eb23e1084325 from MAINTENANCE_MODE
I20250902 21:32:25.192058 5495 ts_manager.cc:284] Unset tserver state for f79e0a34bf4d4181aed51bd155010a25 from MAINTENANCE_MODE
I20250902 21:32:25.215797 5495 ts_manager.cc:284] Unset tserver state for 6311ff5fc63a49108dc3000117399229 from MAINTENANCE_MODE
I20250902 21:32:25.620035 5495 ts_manager.cc:295] Set tserver state for 6311ff5fc63a49108dc3000117399229 to MAINTENANCE_MODE
I20250902 21:32:25.636595 5495 ts_manager.cc:295] Set tserver state for b41910bc980f4cfdbcc6eb23e1084325 to MAINTENANCE_MODE
I20250902 21:32:25.645347 5495 ts_manager.cc:295] Set tserver state for 3a35f7f28cb9438dbcfb3196e167fdc5 to MAINTENANCE_MODE
I20250902 21:32:25.660604 5495 ts_manager.cc:295] Set tserver state for f79e0a34bf4d4181aed51bd155010a25 to MAINTENANCE_MODE
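The ts_manager lines above toggle per-tserver state: while a server is in MAINTENANCE_MODE the master leaves its replicas alone instead of re-replicating them away when it goes down, and unsetting the state restores normal behavior. A minimal registry sketch under that assumption (names are illustrative, not ts_manager's API):

// Hypothetical sketch of a per-tserver state registry consulted before re-replication.
#include <string>
#include <unordered_map>

enum class TServerState { kNone, kMaintenanceMode };

class TSStateRegistry {
 public:
  void SetState(const std::string& uuid, TServerState s) { states_[uuid] = s; }
  void UnsetState(const std::string& uuid) { states_.erase(uuid); }
  bool ShouldRereplicateFrom(const std::string& uuid) const {
    auto it = states_.find(uuid);
    // A failed server in maintenance mode is left alone; others trigger re-replication.
    return it == states_.end() || it->second != TServerState::kMaintenanceMode;
  }
 private:
  std::unordered_map<std::string, TServerState> states_;
};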
I20250902 21:32:25.820995 6382 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
I20250902 21:32:25.822114 6785 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
I20250902 21:32:25.822845 6649 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
I20250902 21:32:25.959126 6450 tablet_service.cc:1423] Tablet server 6311ff5fc63a49108dc3000117399229 set to quiescing
I20250902 21:32:25.959197 6450 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250902 21:32:26.015453 6712 tablet_service.cc:1423] Tablet server f79e0a34bf4d4181aed51bd155010a25 set to quiescing
I20250902 21:32:26.015511 6712 tablet_service.cc:1430] Tablet server has 0 leaders and 3 scanners
I20250902 21:32:26.040411 6583 tablet_service.cc:1423] Tablet server 3a35f7f28cb9438dbcfb3196e167fdc5 set to quiescing
I20250902 21:32:26.040478 6583 tablet_service.cc:1430] Tablet server has 1 leaders and 0 scanners
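The repeated "Tablet server has N leaders and M scanners" lines are the quiescing countdown: the rolling-restart flow polls each quiescing server until it holds no tablet leaderships and no active scanners before it is safe to kill. A minimal wait-loop sketch of that idea (illustrative, not the test's actual helper):

// Hypothetical sketch: poll a quiescing server until leaders and scanners both hit zero.
#include <chrono>
#include <functional>
#include <thread>

struct QuiescingCounts { int leaders; int scanners; };

bool WaitUntilQuiesced(const std::function<QuiescingCounts()>& poll,
                       std::chrono::milliseconds timeout) {
  auto deadline = std::chrono::steady_clock::now() + timeout;
  while (std::chrono::steady_clock::now() < deadline) {
    QuiescingCounts c = poll();
    if (c.leaders == 0 && c.scanners == 0) return true;  // Safe to restart now.
    std::this_thread::sleep_for(std::chrono::milliseconds(50));
  }
  return false;  // Caller decides whether to proceed anyway or fail the restart.
}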
I20250902 21:32:26.048969 6830 raft_consensus.cc:991] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: : Instructing follower b41910bc980f4cfdbcc6eb23e1084325 to start an election
I20250902 21:32:26.049036 6830 raft_consensus.cc:1079] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 3 LEADER]: Signalling peer b41910bc980f4cfdbcc6eb23e1084325 to start an election
I20250902 21:32:26.049490 6335 tablet_service.cc:1940] Received Run Leader Election RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b"
dest_uuid: "b41910bc980f4cfdbcc6eb23e1084325"
from {username='slave'} at 127.4.52.196:51213
I20250902 21:32:26.049593 6335 raft_consensus.cc:491] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 3 FOLLOWER]: Starting forced leader election (received explicit request)
I20250902 21:32:26.049638 6335 raft_consensus.cc:3058] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 3 FOLLOWER]: Advancing to term 4
I20250902 21:32:26.049882 6849 raft_consensus.cc:991] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: : Instructing follower b41910bc980f4cfdbcc6eb23e1084325 to start an election
I20250902 21:32:26.049924 6849 raft_consensus.cc:1079] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 3 LEADER]: Signalling peer b41910bc980f4cfdbcc6eb23e1084325 to start an election
I20250902 21:32:26.050398 6335 raft_consensus.cc:513] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 4 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:26.050536 6335 leader_election.cc:290] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [CANDIDATE]: Term 4 election: Requested vote from peers 3a35f7f28cb9438dbcfb3196e167fdc5 (127.4.52.196:35541), f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587)
I20250902 21:32:26.050808 6585 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "b41910bc980f4cfdbcc6eb23e1084325" candidate_term: 4 candidate_status { last_received { term: 3 index: 7706 } } ignore_live_leader: true dest_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5"
I20250902 21:32:26.050843 6739 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "b41910bc980f4cfdbcc6eb23e1084325" candidate_term: 4 candidate_status { last_received { term: 3 index: 7706 } } ignore_live_leader: true dest_uuid: "f79e0a34bf4d4181aed51bd155010a25"
I20250902 21:32:26.050851 6335 tablet_service.cc:1940] Received Run Leader Election RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b"
dest_uuid: "b41910bc980f4cfdbcc6eb23e1084325"
from {username='slave'} at 127.4.52.196:51213
I20250902 21:32:26.050907 6335 raft_consensus.cc:491] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 4 FOLLOWER]: Starting forced leader election (received explicit request)
I20250902 21:32:26.050915 6585 raft_consensus.cc:3053] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 3 LEADER]: Stepping down as leader of term 3
I20250902 21:32:26.050938 6335 raft_consensus.cc:3058] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 4 FOLLOWER]: Advancing to term 5
I20250902 21:32:26.050940 6585 raft_consensus.cc:738] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 3 LEADER]: Becoming Follower/Learner. State: Replica: 3a35f7f28cb9438dbcfb3196e167fdc5, State: Running, Role: LEADER
I20250902 21:32:26.051004 6585 consensus_queue.cc:260] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 7706, Committed index: 7706, Last appended: 3.7706, Last appended by leader: 7706, Current term: 3, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:26.051079 6585 raft_consensus.cc:3058] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 3 FOLLOWER]: Advancing to term 4
I20250902 21:32:26.051635 6335 raft_consensus.cc:513] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 5 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:26.051755 6335 leader_election.cc:290] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [CANDIDATE]: Term 5 election: Requested vote from peers 3a35f7f28cb9438dbcfb3196e167fdc5 (127.4.52.196:35541), f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587)
I20250902 21:32:26.052047 6585 raft_consensus.cc:2466] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 4 FOLLOWER]: Leader election vote request: Granting yes vote for candidate b41910bc980f4cfdbcc6eb23e1084325 in term 4.
I20250902 21:32:26.052217 6336 raft_consensus.cc:1238] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 5 FOLLOWER]: Rejecting Update request from peer 3a35f7f28cb9438dbcfb3196e167fdc5 for earlier term 3. Current term is 5. Ops: []
I20250902 21:32:26.052485 6270 leader_election.cc:304] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [CANDIDATE]: Term 4 election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 3a35f7f28cb9438dbcfb3196e167fdc5, b41910bc980f4cfdbcc6eb23e1084325; no voters: f79e0a34bf4d4181aed51bd155010a25
I20250902 21:32:26.052528 6585 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "b41910bc980f4cfdbcc6eb23e1084325" candidate_term: 5 candidate_status { last_received { term: 3 index: 7706 } } ignore_live_leader: true dest_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5"
I20250902 21:32:26.052603 6585 raft_consensus.cc:3058] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 4 FOLLOWER]: Advancing to term 5
I20250902 21:32:26.052565 6739 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "b41910bc980f4cfdbcc6eb23e1084325" candidate_term: 5 candidate_status { last_received { term: 3 index: 7706 } } ignore_live_leader: true dest_uuid: "f79e0a34bf4d4181aed51bd155010a25"
I20250902 21:32:26.052642 6739 raft_consensus.cc:3058] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 3 FOLLOWER]: Advancing to term 5
W20250902 21:32:26.053161 6824 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:49020: Illegal state: Cannot assign timestamp to op. Tablet is not in leader mode. Last heard from a leader: 0.003s ago.
I20250902 21:32:26.053298 6585 raft_consensus.cc:2466] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 5 FOLLOWER]: Leader election vote request: Granting yes vote for candidate b41910bc980f4cfdbcc6eb23e1084325 in term 5.
I20250902 21:32:26.053402 6739 raft_consensus.cc:2466] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 5 FOLLOWER]: Leader election vote request: Granting yes vote for candidate b41910bc980f4cfdbcc6eb23e1084325 in term 5.
I20250902 21:32:26.053592 6268 leader_election.cc:304] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [CANDIDATE]: Term 5 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: b41910bc980f4cfdbcc6eb23e1084325, f79e0a34bf4d4181aed51bd155010a25; no voters:
I20250902 21:32:26.053725 6316 tablet_service.cc:1423] Tablet server b41910bc980f4cfdbcc6eb23e1084325 set to quiescing
I20250902 21:32:26.053791 6316 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250902 21:32:26.053936 7037 raft_consensus.cc:2762] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 5 FOLLOWER]: Leader election decision vote started in defunct term 4: won
W20250902 21:32:26.053436 6563 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:49020: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:26.053627 6824 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:49020: Illegal state: Cannot assign timestamp to op. Tablet is not in leader mode. Last heard from a leader: 0.003s ago.
I20250902 21:32:26.054262 7036 raft_consensus.cc:2802] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 5 FOLLOWER]: Leader election won for term 5
I20250902 21:32:26.054495 7036 raft_consensus.cc:695] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 5 LEADER]: Becoming Leader. State: Replica: b41910bc980f4cfdbcc6eb23e1084325, State: Running, Role: LEADER
I20250902 21:32:26.054677 7036 consensus_queue.cc:237] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 7705, Committed index: 7705, Last appended: 3.7706, Last appended by leader: 7706, Current term: 5, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:26.055297 5495 catalog_manager.cc:5582] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 reported cstate change: term changed from 3 to 5, leader changed from 3a35f7f28cb9438dbcfb3196e167fdc5 (127.4.52.196) to b41910bc980f4cfdbcc6eb23e1084325 (127.4.52.194). New cstate: current_term: 5 leader_uuid: "b41910bc980f4cfdbcc6eb23e1084325" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } health_report { overall_health: HEALTHY } } }
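The episode above is a forced leadership handoff: the quiescing leader signals follower b41910bc... to start an election, the follower bumps its term and requests votes with ignore_live_leader set, and because the signal arrives twice the target passes through terms 4 and 5 before winning. A minimal sketch of the follower-side reaction, assuming one term bump per signal (illustrative, not raft_consensus.cc):

// Hypothetical sketch of the forced-election request built on each "start an election" signal.
#include <cstdint>

struct ForcedElectionRequest {
  int64_t candidate_term;
  bool ignore_live_leader;
};

// Each signal advances the term once, which is why two back-to-back signals
// produce elections for term 4 and then term 5.
ForcedElectionRequest StartForcedElection(int64_t* current_term) {
  *current_term += 1;
  return ForcedElectionRequest{*current_term, /*ignore_live_leader=*/true};
}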
W20250902 21:32:26.056087 6685 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59212: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:26.056123 6699 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59212: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:26.057878 6699 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59212: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
I20250902 21:32:26.060242 6585 raft_consensus.cc:1273] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 5 FOLLOWER]: Refusing update from remote peer b41910bc980f4cfdbcc6eb23e1084325: Log matching property violated. Preceding OpId in replica: term: 3 index: 7706. Preceding OpId from leader: term: 5 index: 7708. (index mismatch)
I20250902 21:32:26.060410 6739 raft_consensus.cc:1273] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 5 FOLLOWER]: Refusing update from remote peer b41910bc980f4cfdbcc6eb23e1084325: Log matching property violated. Preceding OpId in replica: term: 3 index: 7706. Preceding OpId from leader: term: 5 index: 7708. (index mismatch)
I20250902 21:32:26.060732 7037 consensus_queue.cc:1035] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [LEADER]: Connected to new peer: Peer: permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 7707, Last known committed idx: 7706, Time since last communication: 0.000s
I20250902 21:32:26.060966 7036 consensus_queue.cc:1035] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [LEADER]: Connected to new peer: Peer: permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 7707, Last known committed idx: 7705, Time since last communication: 0.000s
I20250902 21:32:26.062109 6824 mvcc.cc:204] Tried to move back new op lower bound from 7196052463864287232 to 7196052463840600064. Current Snapshot: MvccSnapshot[applied={T|T < 7196052463860563968}]
I20250902 21:32:26.064011 6826 mvcc.cc:204] Tried to move back new op lower bound from 7196052463864287232 to 7196052463840600064. Current Snapshot: MvccSnapshot[applied={T|T < 7196052463864287232}]
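The "Tried to move back new op lower bound" messages show the MVCC safe-time watermark being kept monotonic after the leadership change: a candidate value older than what was already published is clamped rather than applied. A minimal sketch of a forward-only watermark (illustrative, not mvcc.cc):

// Hypothetical sketch of a monotonic watermark that ignores attempts to move it backward.
#include <algorithm>
#include <cstdint>

class MonotonicWatermark {
 public:
  // Returns the effective watermark: never smaller than anything seen before.
  uint64_t Advance(uint64_t candidate) {
    value_ = std::max(value_, candidate);
    return value_;
  }
 private:
  uint64_t value_ = 0;
};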
I20250902 21:32:26.068990 6516 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
I20250902 21:32:27.253947 6712 tablet_service.cc:1423] Tablet server f79e0a34bf4d4181aed51bd155010a25 set to quiescing
I20250902 21:32:27.254024 6712 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250902 21:32:27.307281 6583 tablet_service.cc:1423] Tablet server 3a35f7f28cb9438dbcfb3196e167fdc5 set to quiescing
I20250902 21:32:27.307343 6583 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250902 21:32:27.364548 4307 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6wlXYv/build/release/bin/kudu with pid 6252
W20250902 21:32:27.373979 6092 connection.cc:537] client connection to 127.4.52.194:33503 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20250902 21:32:27.373984 6673 connection.cc:537] server connection from 127.4.52.194:57043 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20250902 21:32:27.374150 6092 meta_cache.cc:302] tablet 1bd11ec67527495b831ab711f8a2f39b: replica b41910bc980f4cfdbcc6eb23e1084325 (127.4.52.194:33503) has failed: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
I20250902 21:32:27.374193 6123 meta_cache.cc:1510] marking tablet server b41910bc980f4cfdbcc6eb23e1084325 (127.4.52.194:33503) as failed
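Once the killed server's connection resets, the client marks that replica failed and routes subsequent operations to the remaining replicas until the server re-registers. A minimal replica-selection sketch under that assumption (illustrative, not the Kudu client's meta_cache):

// Hypothetical sketch: skip replicas marked failed when picking a server for the next op.
#include <string>
#include <unordered_set>
#include <utility>
#include <vector>

class ReplicaPicker {
 public:
  explicit ReplicaPicker(std::vector<std::string> replicas) : replicas_(std::move(replicas)) {}
  void MarkFailed(const std::string& uuid) { failed_.insert(uuid); }
  // Returns the first replica not known to have failed, or empty if none remain.
  std::string PickHealthy() const {
    for (const auto& r : replicas_) {
      if (failed_.count(r) == 0) return r;
    }
    return {};
  }
 private:
  std::vector<std::string> replicas_;
  std::unordered_set<std::string> failed_;
};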
I20250902 21:32:27.375044 4307 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6wlXYv/build/release/bin/kudu
/tmp/dist-test-task6wlXYv/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.4.52.194:33503
--local_ip_for_outbound_sockets=127.4.52.194
--tserver_master_addrs=127.4.52.254:39913
--webserver_port=37969
--webserver_interface=127.4.52.194
--builtin_ntp_servers=127.4.52.212:42387
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20250902 21:32:27.377530 6122 meta_cache.cc:1510] marking tablet server b41910bc980f4cfdbcc6eb23e1084325 (127.4.52.194:33503) as failed
W20250902 21:32:27.377776 6563 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:49020: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:27.380452 6563 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:49020: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:27.384161 6699 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59212: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:27.384574 6699 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59212: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:27.385375 6699 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59212: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:27.387919 6563 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:49020: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:27.397092 6558 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:49020: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:27.397086 6563 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:49020: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:27.399107 6563 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:49020: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:27.404805 6685 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59212: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:27.404808 6699 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59212: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:27.407728 6699 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59212: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:27.420012 6563 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:49020: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:27.424098 6563 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:49020: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:27.427199 6563 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:49020: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:27.427647 6699 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59212: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:27.432834 6699 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59212: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:27.435972 6699 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59212: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:27.447288 6563 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:49020: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:27.455051 7067 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250902 21:32:27.455193 7067 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250902 21:32:27.455210 7067 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250902 21:32:27.456462 7067 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250902 21:32:27.456512 7067 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.4.52.194
I20250902 21:32:27.457947 7067 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.4.52.212:42387
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.4.52.194:33503
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.4.52.194
--webserver_port=37969
--tserver_master_addrs=127.4.52.254:39913
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.7067
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.4.52.194
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 6ab3f249351a4ee5d73eb8bda3585c4dcd85d070
build type RELEASE
built by None at 02 Sep 2025 21:15:19 UTC on 5fd53c4cbb9d
build id 7867
I20250902 21:32:27.458184 7067 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
W20250902 21:32:27.458292 6563 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:49020: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:27.458292 6558 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:49020: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
I20250902 21:32:27.458431 7067 file_cache.cc:492] Constructed file cache 'file cache' with capacity 419430
W20250902 21:32:27.461220 7076 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:27.461275 7073 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:27.461253 7067 server_base.cc:1047] running on GCE node
W20250902 21:32:27.461417 7074 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:27.461598 7067 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250902 21:32:27.461802 7067 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
W20250902 21:32:27.461925 6699 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59212: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
I20250902 21:32:27.462967 7067 hybrid_clock.cc:648] HybridClock initialized: now 1756848747462942 us; error 34 us; skew 500 ppm
I20250902 21:32:27.464035 7067 webserver.cc:480] Webserver started at http://127.4.52.194:37969/ using document root <none> and password file <none>
I20250902 21:32:27.464210 7067 fs_manager.cc:362] Metadata directory not provided
I20250902 21:32:27.464243 7067 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250902 21:32:27.465266 7067 fs_manager.cc:714] Time spent opening directory manager: real 0.001s user 0.001s sys 0.000s
I20250902 21:32:27.465828 7082 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:27.465975 7067 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:27.466041 7067 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/data,/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/wal
uuid: "b41910bc980f4cfdbcc6eb23e1084325"
format_stamp: "Formatted at 2025-09-02 21:32:16 on dist-test-slave-jkp9"
I20250902 21:32:27.466274 7067 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
W20250902 21:32:27.472113 6699 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59212: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:27.473112 6699 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59212: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
I20250902 21:32:27.477826 7067 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250902 21:32:27.478061 7067 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250902 21:32:27.478164 7067 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250902 21:32:27.478377 7067 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250902 21:32:27.478722 7089 ts_tablet_manager.cc:536] Loading tablet metadata (0/1 complete)
I20250902 21:32:27.479498 7067 ts_tablet_manager.cc:579] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20250902 21:32:27.479538 7067 ts_tablet_manager.cc:525] Time spent loading tablet metadata: real 0.001s user 0.000s sys 0.000s
I20250902 21:32:27.479571 7067 ts_tablet_manager.cc:594] Registering tablets (0/1 complete)
I20250902 21:32:27.480084 7067 ts_tablet_manager.cc:610] Registered 1 tablets
I20250902 21:32:27.480119 7067 ts_tablet_manager.cc:589] Time spent registering tablets: real 0.001s user 0.000s sys 0.000s
I20250902 21:32:27.480157 7089 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Bootstrap starting.
I20250902 21:32:27.485800 7067 rpc_server.cc:307] RPC server started. Bound to: 127.4.52.194:33503
I20250902 21:32:27.485862 7196 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.4.52.194:33503 every 8 connection(s)
I20250902 21:32:27.486155 7067 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/data/info.pb
I20250902 21:32:27.489909 4307 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6wlXYv/build/release/bin/kudu as pid 7067
I20250902 21:32:27.490010 4307 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6wlXYv/build/release/bin/kudu with pid 6387
I20250902 21:32:27.495868 7197 heartbeater.cc:344] Connected to a master server at 127.4.52.254:39913
I20250902 21:32:27.495971 7197 heartbeater.cc:461] Registering TS with master...
I20250902 21:32:27.496187 7197 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
I20250902 21:32:27.496730 5497 ts_manager.cc:194] Re-registered known tserver with Master: b41910bc980f4cfdbcc6eb23e1084325 (127.4.52.194:33503)
I20250902 21:32:27.496979 4307 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6wlXYv/build/release/bin/kudu
/tmp/dist-test-task6wlXYv/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.4.52.193:34353
--local_ip_for_outbound_sockets=127.4.52.193
--tserver_master_addrs=127.4.52.254:39913
--webserver_port=43863
--webserver_interface=127.4.52.193
--builtin_ntp_servers=127.4.52.212:42387
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
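For orientation, the argv dump above is how the test harness (external_mini_cluster.cc) relaunches tablet server ts-0: the common minicluster flags come first, then the `tserver run` subcommand, then the server-specific flags, and the trailing "with env {}" indicates no extra environment overrides. A minimal, purely illustrative Python sketch of spawning a child process with an argument list assembled the same way (binary path and flags copied from the lines above; this is not the harness's actual implementation):

    import shlex
    import subprocess

    # Binary plus a representative subset of the flags logged above; the remaining
    # flags from the argv dump would be appended in exactly the same way.
    argv = [
        "/tmp/dist-test-task6wlXYv/build/release/bin/kudu",
        "--block_manager=log",
        "--logtostderr",
        "tserver", "run",
        "--rpc_bind_addresses=127.4.52.193:34353",
        "--tserver_master_addrs=127.4.52.254:39913",
        "--raft_heartbeat_interval_ms=100",
        "--scanner_default_batch_size_bytes=100",
    ]

    print(shlex.join(argv))      # the command line as it would be executed
    # subprocess.Popen(argv)     # launching it requires the binary above to exist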
I20250902 21:32:27.497270 5497 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.4.52.194:48241
W20250902 21:32:27.511338 6563 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:49020: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:27.511338 6558 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:49020: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:27.526544 6558 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:49020: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:27.527065 6699 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59212: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:27.529157 6699 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59212: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:27.545261 6699 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59212: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:27.564565 6558 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:49020: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:27.564565 6563 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:49020: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:27.580736 6558 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:49020: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:27.582369 6699 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59212: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:27.583400 6699 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59212: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
I20250902 21:32:27.589064 7089 log.cc:826] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Log is configured to *not* fsync() on all Append() calls
W20250902 21:32:27.599756 6699 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59212: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:27.602546 7202 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250902 21:32:27.602706 7202 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250902 21:32:27.602737 7202 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250902 21:32:27.604370 7202 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250902 21:32:27.604429 7202 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.4.52.193
I20250902 21:32:27.606554 7202 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.4.52.212:42387
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.4.52.193:34353
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.4.52.193
--webserver_port=43863
--tserver_master_addrs=127.4.52.254:39913
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.7202
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.4.52.193
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 6ab3f249351a4ee5d73eb8bda3585c4dcd85d070
build type RELEASE
built by None at 02 Sep 2025 21:15:19 UTC on 5fd53c4cbb9d
build id 7867
I20250902 21:32:27.606804 7202 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250902 21:32:27.607015 7202 file_cache.cc:492] Constructed file cache 'file cache' with capacity 419430
W20250902 21:32:27.609297 7209 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:27.609426 7211 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:27.609319 7208 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:27.609697 7202 server_base.cc:1047] running on GCE node
I20250902 21:32:27.609884 7202 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250902 21:32:27.610095 7202 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250902 21:32:27.611251 7202 hybrid_clock.cc:648] HybridClock initialized: now 1756848747611241 us; error 29 us; skew 500 ppm
I20250902 21:32:27.612645 7202 webserver.cc:480] Webserver started at http://127.4.52.193:43863/ using document root <none> and password file <none>
I20250902 21:32:27.612869 7202 fs_manager.cc:362] Metadata directory not provided
I20250902 21:32:27.612926 7202 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250902 21:32:27.614374 7202 fs_manager.cc:714] Time spent opening directory manager: real 0.001s user 0.001s sys 0.000s
I20250902 21:32:27.615007 7217 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:27.615168 7202 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.001s sys 0.000s
I20250902 21:32:27.615244 7202 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/data,/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/wal
uuid: "6311ff5fc63a49108dc3000117399229"
format_stamp: "Formatted at 2025-09-02 21:32:16 on dist-test-slave-jkp9"
I20250902 21:32:27.615535 7202 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
W20250902 21:32:27.625998 6558 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:49020: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:27.626176 6563 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:49020: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
I20250902 21:32:27.643914 7202 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250902 21:32:27.644193 7202 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250902 21:32:27.644308 7202 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250902 21:32:27.644533 7202 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250902 21:32:27.644881 7202 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250902 21:32:27.644956 7202 ts_tablet_manager.cc:525] Time spent loading tablet metadata: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:27.644996 7202 ts_tablet_manager.cc:610] Registered 0 tablets
I20250902 21:32:27.645021 7202 ts_tablet_manager.cc:589] Time spent registering tablets: real 0.000s user 0.000s sys 0.000s
W20250902 21:32:27.645145 6563 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:49020: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:27.647755 6699 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59212: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:27.650887 6699 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59212: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
I20250902 21:32:27.651444 7202 rpc_server.cc:307] RPC server started. Bound to: 127.4.52.193:34353
I20250902 21:32:27.651510 7330 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.4.52.193:34353 every 8 connection(s)
I20250902 21:32:27.651830 7202 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/data/info.pb
I20250902 21:32:27.652213 4307 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6wlXYv/build/release/bin/kudu as pid 7202
I20250902 21:32:27.652331 4307 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6wlXYv/build/release/bin/kudu with pid 6519
I20250902 21:32:27.659250 7331 heartbeater.cc:344] Connected to a master server at 127.4.52.254:39913
I20250902 21:32:27.659387 7331 heartbeater.cc:461] Registering TS with master...
I20250902 21:32:27.659637 7331 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
I20250902 21:32:27.660043 5497 ts_manager.cc:194] Re-registered known tserver with Master: 6311ff5fc63a49108dc3000117399229 (127.4.52.193:34353)
I20250902 21:32:27.660513 5497 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.4.52.193:46721
I20250902 21:32:27.662068 4307 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6wlXYv/build/release/bin/kudu
/tmp/dist-test-task6wlXYv/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.4.52.196:35541
--local_ip_for_outbound_sockets=127.4.52.196
--tserver_master_addrs=127.4.52.254:39913
--webserver_port=36885
--webserver_interface=127.4.52.196
--builtin_ntp_servers=127.4.52.212:42387
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20250902 21:32:27.664721 7335 raft_consensus.cc:668] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: failed to trigger leader election: Illegal state: leader elections are disabled
W20250902 21:32:27.666077 6699 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59212: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:27.716578 6699 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59212: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:27.724871 6699 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59212: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:27.726965 6699 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59212: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:27.762741 7334 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250902 21:32:27.762933 7334 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250902 21:32:27.762953 7334 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250902 21:32:27.764249 7334 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250902 21:32:27.764297 7334 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.4.52.196
I20250902 21:32:27.765570 7334 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.4.52.212:42387
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.4.52.196:35541
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.4.52.196
--webserver_port=36885
--tserver_master_addrs=127.4.52.254:39913
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.7334
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.4.52.196
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 6ab3f249351a4ee5d73eb8bda3585c4dcd85d070
build type RELEASE
built by None at 02 Sep 2025 21:15:19 UTC on 5fd53c4cbb9d
build id 7867
I20250902 21:32:27.765762 7334 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250902 21:32:27.765980 7334 file_cache.cc:492] Constructed file cache 'file cache' with capacity 419430
W20250902 21:32:27.768232 7340 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:27.768265 7343 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:27.768267 7341 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:27.768380 7334 server_base.cc:1047] running on GCE node
I20250902 21:32:27.768549 7334 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250902 21:32:27.768738 7334 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250902 21:32:27.769856 7334 hybrid_clock.cc:648] HybridClock initialized: now 1756848747769843 us; error 29 us; skew 500 ppm
I20250902 21:32:27.771076 7334 webserver.cc:480] Webserver started at http://127.4.52.196:36885/ using document root <none> and password file <none>
I20250902 21:32:27.771251 7334 fs_manager.cc:362] Metadata directory not provided
I20250902 21:32:27.771296 7334 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250902 21:32:27.772418 7334 fs_manager.cc:714] Time spent opening directory manager: real 0.001s user 0.000s sys 0.001s
I20250902 21:32:27.773023 7349 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:27.773221 7334 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.000s sys 0.001s
I20250902 21:32:27.773303 7334 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/data,/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/wal
uuid: "3a35f7f28cb9438dbcfb3196e167fdc5"
format_stamp: "Formatted at 2025-09-02 21:32:16 on dist-test-slave-jkp9"
I20250902 21:32:27.773610 7334 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250902 21:32:27.778957 6122 meta_cache.cc:1510] marking tablet server 3a35f7f28cb9438dbcfb3196e167fdc5 (127.4.52.196:35541) as failed
I20250902 21:32:27.778957 6123 meta_cache.cc:1510] marking tablet server 3a35f7f28cb9438dbcfb3196e167fdc5 (127.4.52.196:35541) as failed
W20250902 21:32:27.785157 6699 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59212: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
I20250902 21:32:27.792289 7334 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250902 21:32:27.792523 7334 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250902 21:32:27.792645 7334 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250902 21:32:27.792873 7334 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250902 21:32:27.793313 7356 ts_tablet_manager.cc:536] Loading tablet metadata (0/1 complete)
I20250902 21:32:27.794015 7334 ts_tablet_manager.cc:579] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20250902 21:32:27.794064 7334 ts_tablet_manager.cc:525] Time spent loading tablet metadata: real 0.001s user 0.000s sys 0.000s
I20250902 21:32:27.794093 7334 ts_tablet_manager.cc:594] Registering tablets (0/1 complete)
I20250902 21:32:27.794735 7334 ts_tablet_manager.cc:610] Registered 1 tablets
I20250902 21:32:27.794824 7356 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Bootstrap starting.
I20250902 21:32:27.794826 7334 ts_tablet_manager.cc:589] Time spent registering tablets: real 0.001s user 0.000s sys 0.001s
W20250902 21:32:27.796336 6699 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59212: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
I20250902 21:32:27.801404 7334 rpc_server.cc:307] RPC server started. Bound to: 127.4.52.196:35541
I20250902 21:32:27.801492 7463 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.4.52.196:35541 every 8 connection(s)
I20250902 21:32:27.801779 7334 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/data/info.pb
I20250902 21:32:27.807049 4307 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6wlXYv/build/release/bin/kudu as pid 7334
I20250902 21:32:27.807178 4307 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6wlXYv/build/release/bin/kudu with pid 6652
I20250902 21:32:27.811661 7464 heartbeater.cc:344] Connected to a master server at 127.4.52.254:39913
I20250902 21:32:27.811755 7464 heartbeater.cc:461] Registering TS with master...
I20250902 21:32:27.811941 7464 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
I20250902 21:32:27.812541 5497 ts_manager.cc:194] Re-registered known tserver with Master: 3a35f7f28cb9438dbcfb3196e167fdc5 (127.4.52.196:35541)
I20250902 21:32:27.813060 5497 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.4.52.196:54685
W20250902 21:32:27.818286 6092 connection.cc:537] client connection to 127.4.52.195:46587 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
I20250902 21:32:27.818626 4307 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6wlXYv/build/release/bin/kudu
/tmp/dist-test-task6wlXYv/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.4.52.195:46587
--local_ip_for_outbound_sockets=127.4.52.195
--tserver_master_addrs=127.4.52.254:39913
--webserver_port=32789
--webserver_interface=127.4.52.195
--builtin_ntp_servers=127.4.52.212:42387
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20250902 21:32:27.906782 7356 log.cc:826] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Log is configured to *not* fsync() on all Append() calls
W20250902 21:32:27.926528 7467 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250902 21:32:27.926725 7467 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250902 21:32:27.926771 7467 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250902 21:32:27.928891 7467 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250902 21:32:27.928982 7467 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.4.52.195
I20250902 21:32:27.931325 7467 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.4.52.212:42387
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.4.52.195:46587
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.4.52.195
--webserver_port=32789
--tserver_master_addrs=127.4.52.254:39913
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.7467
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.4.52.195
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 6ab3f249351a4ee5d73eb8bda3585c4dcd85d070
build type RELEASE
built by None at 02 Sep 2025 21:15:19 UTC on 5fd53c4cbb9d
build id 7867
I20250902 21:32:27.931582 7467 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250902 21:32:27.931949 7467 file_cache.cc:492] Constructed file cache 'file cache' with capacity 419430
W20250902 21:32:27.934808 7475 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:27.934818 7474 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:27.935067 7477 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:27.935650 7467 server_base.cc:1047] running on GCE node
I20250902 21:32:27.935817 7467 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250902 21:32:27.936033 7467 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250902 21:32:27.939325 7467 hybrid_clock.cc:648] HybridClock initialized: now 1756848747938857 us; error 481 us; skew 500 ppm
I20250902 21:32:27.940794 7467 webserver.cc:480] Webserver started at http://127.4.52.195:32789/ using document root <none> and password file <none>
I20250902 21:32:27.941035 7467 fs_manager.cc:362] Metadata directory not provided
I20250902 21:32:27.941097 7467 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250902 21:32:27.942662 7467 fs_manager.cc:714] Time spent opening directory manager: real 0.001s user 0.001s sys 0.000s
I20250902 21:32:27.943555 7483 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:27.943857 7467 fs_manager.cc:730] Time spent opening block manager: real 0.001s user 0.000s sys 0.001s
I20250902 21:32:27.943939 7467 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/data,/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/wal
uuid: "f79e0a34bf4d4181aed51bd155010a25"
format_stamp: "Formatted at 2025-09-02 21:32:16 on dist-test-slave-jkp9"
I20250902 21:32:27.944275 7467 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250902 21:32:27.971652 7467 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250902 21:32:27.971939 7467 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250902 21:32:27.972054 7467 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250902 21:32:27.972324 7467 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250902 21:32:27.972831 7490 ts_tablet_manager.cc:536] Loading tablet metadata (0/1 complete)
I20250902 21:32:27.973974 7467 ts_tablet_manager.cc:579] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20250902 21:32:27.974023 7467 ts_tablet_manager.cc:525] Time spent loading tablet metadata: real 0.001s user 0.000s sys 0.000s
I20250902 21:32:27.974056 7467 ts_tablet_manager.cc:594] Registering tablets (0/1 complete)
I20250902 21:32:27.974732 7467 ts_tablet_manager.cc:610] Registered 1 tablets
I20250902 21:32:27.974821 7467 ts_tablet_manager.cc:589] Time spent registering tablets: real 0.001s user 0.001s sys 0.000s
I20250902 21:32:27.974900 7490 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Bootstrap starting.
I20250902 21:32:27.980913 7467 rpc_server.cc:307] RPC server started. Bound to: 127.4.52.195:46587
I20250902 21:32:27.981318 7467 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/data/info.pb
I20250902 21:32:27.984001 4307 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6wlXYv/build/release/bin/kudu as pid 7467
I20250902 21:32:27.992591 7597 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.4.52.195:46587 every 8 connection(s)
I20250902 21:32:28.006886 7598 heartbeater.cc:344] Connected to a master server at 127.4.52.254:39913
I20250902 21:32:28.006994 7598 heartbeater.cc:461] Registering TS with master...
I20250902 21:32:28.007201 7598 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
I20250902 21:32:28.007812 5497 ts_manager.cc:194] Re-registered known tserver with Master: f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587)
I20250902 21:32:28.008312 5497 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.4.52.195:57001
I20250902 21:32:28.095775 7490 log.cc:826] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Log is configured to *not* fsync() on all Append() calls
I20250902 21:32:28.161384 7398 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250902 21:32:28.176049 7520 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250902 21:32:28.185252 7120 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250902 21:32:28.186597 7265 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250902 21:32:28.491663 7089 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Bootstrap replayed 1/2 log segments. Stats: ops{read=4793 overwritten=0 applied=4792 ignored=0} inserts{seen=239450 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20250902 21:32:28.498236 7197 heartbeater.cc:499] Master 127.4.52.254:39913 was elected leader, sending a full tablet report...
I20250902 21:32:28.661387 7331 heartbeater.cc:499] Master 127.4.52.254:39913 was elected leader, sending a full tablet report...
I20250902 21:32:28.813769 7464 heartbeater.cc:499] Master 127.4.52.254:39913 was elected leader, sending a full tablet report...
I20250902 21:32:28.988709 7356 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Bootstrap replayed 1/2 log segments. Stats: ops{read=4623 overwritten=0 applied=4621 ignored=0} inserts{seen=230900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20250902 21:32:29.009538 7598 heartbeater.cc:499] Master 127.4.52.254:39913 was elected leader, sending a full tablet report...
I20250902 21:32:29.159742 7089 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Bootstrap replayed 2/2 log segments. Stats: ops{read=8973 overwritten=0 applied=8971 ignored=0} inserts{seen=448350 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20250902 21:32:29.160142 7089 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Bootstrap complete.
I20250902 21:32:29.163326 7089 ts_tablet_manager.cc:1397] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Time spent bootstrapping tablet: real 1.683s user 1.383s sys 0.287s
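One small consistency note about the Bootstrap "Stats" lines printed so far: in each of them, ops read minus ops applied equals the reported pending replicate count (4793-4792=1, 4623-4621=2, 8973-8971=2). Reading the pending count as "ops read from the WAL but not yet applied" is an assumption here, not a statement of Kudu's bootstrap internals; the arithmetic check itself is just:

    # (read, applied, pending) triples copied from the Stats lines above.
    stats = [
        (4793, 4792, 1),   # b41910bc... segments 1/2
        (4623, 4621, 2),   # 3a35f7f2... segments 1/2
        (8973, 8971, 2),   # b41910bc... segments 2/2
    ]

    for read, applied, pending in stats:
        # Purely an arithmetic check against the logged numbers.
        assert read - applied == pending, (read, applied, pending)
    print("read - applied == pending holds for every Stats line above")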
I20250902 21:32:29.164517 7089 raft_consensus.cc:357] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 5 FOLLOWER]: Replica starting. Triggering 2 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:29.165143 7089 raft_consensus.cc:738] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 5 FOLLOWER]: Becoming Follower/Learner. State: Replica: b41910bc980f4cfdbcc6eb23e1084325, State: Initialized, Role: FOLLOWER
I20250902 21:32:29.165338 7089 consensus_queue.cc:260] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 8971, Last appended: 5.8973, Last appended by leader: 8973, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:29.165550 7089 ts_tablet_manager.cc:1428] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Time spent starting tablet: real 0.002s user 0.002s sys 0.000s
W20250902 21:32:29.219401 7109 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48022: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:29.222323 7109 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48022: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
I20250902 21:32:29.249617 7490 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Bootstrap replayed 1/2 log segments. Stats: ops{read=4771 overwritten=0 applied=4769 ignored=0} inserts{seen=238300 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
W20250902 21:32:29.299654 7109 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48022: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:29.406244 7109 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48022: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:29.406286 7110 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48022: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
I20250902 21:32:29.488225 7637 raft_consensus.cc:491] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 5 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20250902 21:32:29.488353 7637 raft_consensus.cc:513] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 5 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:29.489218 7637 leader_election.cc:290] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [CANDIDATE]: Term 6 pre-election: Requested pre-vote from peers 3a35f7f28cb9438dbcfb3196e167fdc5 (127.4.52.196:35541), f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587)
W20250902 21:32:29.489629 7110 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48022: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
I20250902 21:32:29.493373 7418 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "b41910bc980f4cfdbcc6eb23e1084325" candidate_term: 6 candidate_status { last_received { term: 5 index: 8973 } } ignore_live_leader: false dest_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" is_pre_election: true
I20250902 21:32:29.493420 7552 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "b41910bc980f4cfdbcc6eb23e1084325" candidate_term: 6 candidate_status { last_received { term: 5 index: 8973 } } ignore_live_leader: false dest_uuid: "f79e0a34bf4d4181aed51bd155010a25" is_pre_election: true
W20250902 21:32:29.494356 7085 leader_election.cc:343] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [CANDIDATE]: Term 6 pre-election: Tablet error from VoteRequest() call to peer 3a35f7f28cb9438dbcfb3196e167fdc5 (127.4.52.196:35541): Illegal state: must be running to vote when last-logged opid is not known
W20250902 21:32:29.494464 7083 leader_election.cc:343] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [CANDIDATE]: Term 6 pre-election: Tablet error from VoteRequest() call to peer f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587): Illegal state: must be running to vote when last-logged opid is not known
I20250902 21:32:29.494514 7083 leader_election.cc:304] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [CANDIDATE]: Term 6 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: b41910bc980f4cfdbcc6eb23e1084325; no voters: 3a35f7f28cb9438dbcfb3196e167fdc5, f79e0a34bf4d4181aed51bd155010a25
I20250902 21:32:29.494645 7637 raft_consensus.cc:2747] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 5 FOLLOWER]: Leader pre-election lost for term 6. Reason: could not achieve majority
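The outcome above is plain majority arithmetic for a three-voter Raft config: the candidate needs floor(3/2)+1 = 2 pre-votes, received only its own, and both peers rejected the request because their tablet replicas were not yet running (both were still bootstrapping at this point, hence "must be running to vote when last-logged opid is not known"). A minimal sketch of that tally, illustrating the numbers in the election summary rather than Kudu's leader_election.cc logic:

    # Figures from the "Election decided" summary above.
    voters, yes_votes = 3, 1                 # only the candidate's own pre-vote
    majority = voters // 2 + 1               # 2 votes needed with 3 voters
    outcome = "candidate lost" if yes_votes < majority else "candidate won"
    print(f"majority size = {majority}; {outcome}")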
W20250902 21:32:29.601151 7109 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48022: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:29.601154 7110 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48022: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:29.609385 6121 scanner-internal.cc:458] Time spent opening tablet: real 2.407s user 0.001s sys 0.000s
W20250902 21:32:29.689668 7109 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48022: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:29.781613 6122 scanner-internal.cc:458] Time spent opening tablet: real 2.408s user 0.002s sys 0.001s
W20250902 21:32:29.781975 6123 scanner-internal.cc:458] Time spent opening tablet: real 2.418s user 0.001s sys 0.000s
W20250902 21:32:29.798270 7110 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48022: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:29.805770 7110 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48022: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
I20250902 21:32:29.851007 7637 raft_consensus.cc:491] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 5 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20250902 21:32:29.851094 7637 raft_consensus.cc:513] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 5 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:29.851209 7637 leader_election.cc:290] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [CANDIDATE]: Term 6 pre-election: Requested pre-vote from peers 3a35f7f28cb9438dbcfb3196e167fdc5 (127.4.52.196:35541), f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587)
I20250902 21:32:29.851492 7552 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "b41910bc980f4cfdbcc6eb23e1084325" candidate_term: 6 candidate_status { last_received { term: 5 index: 8973 } } ignore_live_leader: false dest_uuid: "f79e0a34bf4d4181aed51bd155010a25" is_pre_election: true
W20250902 21:32:29.851711 7083 leader_election.cc:343] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [CANDIDATE]: Term 6 pre-election: Tablet error from VoteRequest() call to peer f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587): Illegal state: must be running to vote when last-logged opid is not known
I20250902 21:32:29.852284 7418 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "b41910bc980f4cfdbcc6eb23e1084325" candidate_term: 6 candidate_status { last_received { term: 5 index: 8973 } } ignore_live_leader: false dest_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" is_pre_election: true
W20250902 21:32:29.853219 7085 leader_election.cc:343] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [CANDIDATE]: Term 6 pre-election: Tablet error from VoteRequest() call to peer 3a35f7f28cb9438dbcfb3196e167fdc5 (127.4.52.196:35541): Illegal state: must be running to vote when last-logged opid is not known
I20250902 21:32:29.853322 7085 leader_election.cc:304] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [CANDIDATE]: Term 6 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: b41910bc980f4cfdbcc6eb23e1084325; no voters: 3a35f7f28cb9438dbcfb3196e167fdc5, f79e0a34bf4d4181aed51bd155010a25
I20250902 21:32:29.853432 7637 raft_consensus.cc:2747] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 5 FOLLOWER]: Leader pre-election lost for term 6. Reason: could not achieve majority
I20250902 21:32:29.894217 7356 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Bootstrap replayed 2/2 log segments. Stats: ops{read=8973 overwritten=0 applied=8973 ignored=0} inserts{seen=448450 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250902 21:32:29.894717 7356 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Bootstrap complete.
W20250902 21:32:29.895404 7109 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48022: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
I20250902 21:32:29.899127 7356 ts_tablet_manager.cc:1397] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Time spent bootstrapping tablet: real 2.104s user 1.842s sys 0.255s
I20250902 21:32:29.900327 7356 raft_consensus.cc:357] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 5 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:29.900573 7356 raft_consensus.cc:738] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 5 FOLLOWER]: Becoming Follower/Learner. State: Replica: 3a35f7f28cb9438dbcfb3196e167fdc5, State: Initialized, Role: FOLLOWER
I20250902 21:32:29.900712 7356 consensus_queue.cc:260] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 8973, Last appended: 5.8973, Last appended by leader: 8973, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:29.900981 7356 ts_tablet_manager.cc:1428] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Time spent starting tablet: real 0.002s user 0.001s sys 0.000s
W20250902 21:32:30.010141 7108 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48022: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:30.015105 7109 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48022: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:30.039057 7373 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59770: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:30.081794 7373 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59770: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:30.085511 7373 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59770: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:30.112568 7108 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48022: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
I20250902 21:32:30.165652 7490 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Bootstrap replayed 2/2 log segments. Stats: ops{read=8973 overwritten=0 applied=8971 ignored=0} inserts{seen=448350 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20250902 21:32:30.166210 7490 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Bootstrap complete.
I20250902 21:32:30.170431 7490 ts_tablet_manager.cc:1397] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Time spent bootstrapping tablet: real 2.196s user 1.912s sys 0.260s
I20250902 21:32:30.171710 7490 raft_consensus.cc:357] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 5 FOLLOWER]: Replica starting. Triggering 2 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:30.172554 7490 raft_consensus.cc:738] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 5 FOLLOWER]: Becoming Follower/Learner. State: Replica: f79e0a34bf4d4181aed51bd155010a25, State: Initialized, Role: FOLLOWER
I20250902 21:32:30.172691 7490 consensus_queue.cc:260] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 8971, Last appended: 5.8973, Last appended by leader: 8973, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:30.172923 7490 ts_tablet_manager.cc:1428] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Time spent starting tablet: real 0.002s user 0.002s sys 0.000s
W20250902 21:32:30.186303 7495 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:54266: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:30.230062 7109 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48022: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:30.230964 7109 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48022: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
I20250902 21:32:30.245368 7644 raft_consensus.cc:491] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 5 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20250902 21:32:30.245499 7644 raft_consensus.cc:513] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 5 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:30.245790 7644 leader_election.cc:290] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [CANDIDATE]: Term 6 pre-election: Requested pre-vote from peers f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587), b41910bc980f4cfdbcc6eb23e1084325 (127.4.52.194:33503)
I20250902 21:32:30.249575 7552 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" candidate_term: 6 candidate_status { last_received { term: 5 index: 8973 } } ignore_live_leader: false dest_uuid: "f79e0a34bf4d4181aed51bd155010a25" is_pre_election: true
I20250902 21:32:30.249715 7552 raft_consensus.cc:2466] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 5 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 3a35f7f28cb9438dbcfb3196e167fdc5 in term 5.
I20250902 21:32:30.249887 7350 leader_election.cc:304] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [CANDIDATE]: Term 6 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 3a35f7f28cb9438dbcfb3196e167fdc5, f79e0a34bf4d4181aed51bd155010a25; no voters:
I20250902 21:32:30.250028 7644 raft_consensus.cc:2802] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 5 FOLLOWER]: Leader pre-election won for term 6
I20250902 21:32:30.250110 7644 raft_consensus.cc:491] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 5 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20250902 21:32:30.250146 7644 raft_consensus.cc:3058] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 5 FOLLOWER]: Advancing to term 6
I20250902 21:32:30.250511 7151 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" candidate_term: 6 candidate_status { last_received { term: 5 index: 8973 } } ignore_live_leader: false dest_uuid: "b41910bc980f4cfdbcc6eb23e1084325" is_pre_election: true
I20250902 21:32:30.250625 7151 raft_consensus.cc:2466] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 5 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 3a35f7f28cb9438dbcfb3196e167fdc5 in term 5.
I20250902 21:32:30.251350 7644 raft_consensus.cc:513] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 6 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:30.251475 7644 leader_election.cc:290] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [CANDIDATE]: Term 6 election: Requested vote from peers f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587), b41910bc980f4cfdbcc6eb23e1084325 (127.4.52.194:33503)
I20250902 21:32:30.251744 7552 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" candidate_term: 6 candidate_status { last_received { term: 5 index: 8973 } } ignore_live_leader: false dest_uuid: "f79e0a34bf4d4181aed51bd155010a25"
I20250902 21:32:30.251822 7552 raft_consensus.cc:3058] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 5 FOLLOWER]: Advancing to term 6
I20250902 21:32:30.251855 7151 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" candidate_term: 6 candidate_status { last_received { term: 5 index: 8973 } } ignore_live_leader: false dest_uuid: "b41910bc980f4cfdbcc6eb23e1084325"
I20250902 21:32:30.251924 7151 raft_consensus.cc:3058] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 5 FOLLOWER]: Advancing to term 6
I20250902 21:32:30.253006 7552 raft_consensus.cc:2466] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 6 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 3a35f7f28cb9438dbcfb3196e167fdc5 in term 6.
I20250902 21:32:30.253154 7151 raft_consensus.cc:2466] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 6 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 3a35f7f28cb9438dbcfb3196e167fdc5 in term 6.
I20250902 21:32:30.253154 7350 leader_election.cc:304] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [CANDIDATE]: Term 6 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 3a35f7f28cb9438dbcfb3196e167fdc5, f79e0a34bf4d4181aed51bd155010a25; no voters:
I20250902 21:32:30.253230 7644 raft_consensus.cc:2802] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 6 FOLLOWER]: Leader election won for term 6
I20250902 21:32:30.253335 7644 raft_consensus.cc:695] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 6 LEADER]: Becoming Leader. State: Replica: 3a35f7f28cb9438dbcfb3196e167fdc5, State: Running, Role: LEADER
I20250902 21:32:30.253422 7644 consensus_queue.cc:237] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 8973, Committed index: 8973, Last appended: 5.8973, Last appended by leader: 8973, Current term: 6, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:30.254191 5495 catalog_manager.cc:5582] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 reported cstate change: term changed from 5 to 6, leader changed from b41910bc980f4cfdbcc6eb23e1084325 (127.4.52.194) to 3a35f7f28cb9438dbcfb3196e167fdc5 (127.4.52.196). New cstate: current_term: 6 leader_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } health_report { overall_health: UNKNOWN } } }
I20250902 21:32:30.264065 7151 raft_consensus.cc:1273] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 6 FOLLOWER]: Refusing update from remote peer 3a35f7f28cb9438dbcfb3196e167fdc5: Log matching property violated. Preceding OpId in replica: term: 5 index: 8973. Preceding OpId from leader: term: 6 index: 8975. (index mismatch)
I20250902 21:32:30.264333 7552 raft_consensus.cc:1273] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 6 FOLLOWER]: Refusing update from remote peer 3a35f7f28cb9438dbcfb3196e167fdc5: Log matching property violated. Preceding OpId in replica: term: 5 index: 8973. Preceding OpId from leader: term: 6 index: 8975. (index mismatch)
I20250902 21:32:30.264351 7644 consensus_queue.cc:1035] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [LEADER]: Connected to new peer: Peer: permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 8974, Last known committed idx: 8971, Time since last communication: 0.000s
I20250902 21:32:30.264627 7644 consensus_queue.cc:1035] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [LEADER]: Connected to new peer: Peer: permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 8974, Last known committed idx: 8971, Time since last communication: 0.000s
I20250902 21:32:30.266611 7657 mvcc.cc:204] Tried to move back new op lower bound from 7196052481079783424 to 7196052481038462976. Current Snapshot: MvccSnapshot[applied={T|T < 7196052481079783424}]
I20250902 21:32:30.267499 7645 mvcc.cc:204] Tried to move back new op lower bound from 7196052481079783424 to 7196052481038462976. Current Snapshot: MvccSnapshot[applied={T|T < 7196052481079783424}]
I20250902 21:32:30.267895 7661 mvcc.cc:204] Tried to move back new op lower bound from 7196052481079783424 to 7196052481038462976. Current Snapshot: MvccSnapshot[applied={T|T < 7196052481079783424 or (T in {7196052481079783424})}]
I20250902 21:32:33.405848 7398 tablet_service.cc:1430] Tablet server has 1 leaders and 0 scanners
I20250902 21:32:33.408279 7120 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250902 21:32:33.421164 7265 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250902 21:32:33.435226 7520 tablet_service.cc:1430] Tablet server has 0 leaders and 3 scanners
I20250902 21:32:33.789331 5495 ts_manager.cc:284] Unset tserver state for 6311ff5fc63a49108dc3000117399229 from MAINTENANCE_MODE
I20250902 21:32:33.853976 5495 ts_manager.cc:284] Unset tserver state for f79e0a34bf4d4181aed51bd155010a25 from MAINTENANCE_MODE
I20250902 21:32:33.931932 5495 ts_manager.cc:284] Unset tserver state for 3a35f7f28cb9438dbcfb3196e167fdc5 from MAINTENANCE_MODE
I20250902 21:32:33.945933 5495 ts_manager.cc:284] Unset tserver state for b41910bc980f4cfdbcc6eb23e1084325 from MAINTENANCE_MODE
I20250902 21:32:34.270125 7197 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
I20250902 21:32:34.270635 7464 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
I20250902 21:32:34.275745 7598 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
I20250902 21:32:34.344663 5495 ts_manager.cc:295] Set tserver state for 6311ff5fc63a49108dc3000117399229 to MAINTENANCE_MODE
I20250902 21:32:34.359680 5495 ts_manager.cc:295] Set tserver state for f79e0a34bf4d4181aed51bd155010a25 to MAINTENANCE_MODE
I20250902 21:32:34.375771 5495 ts_manager.cc:295] Set tserver state for 3a35f7f28cb9438dbcfb3196e167fdc5 to MAINTENANCE_MODE
I20250902 21:32:34.376315 5497 ts_manager.cc:295] Set tserver state for b41910bc980f4cfdbcc6eb23e1084325 to MAINTENANCE_MODE
I20250902 21:32:34.665357 7331 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
I20250902 21:32:34.703519 7265 tablet_service.cc:1423] Tablet server 6311ff5fc63a49108dc3000117399229 set to quiescing
I20250902 21:32:34.703598 7265 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250902 21:32:34.708012 7398 tablet_service.cc:1423] Tablet server 3a35f7f28cb9438dbcfb3196e167fdc5 set to quiescing
I20250902 21:32:34.708068 7398 tablet_service.cc:1430] Tablet server has 1 leaders and 0 scanners
I20250902 21:32:34.710352 7651 raft_consensus.cc:991] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: : Instructing follower f79e0a34bf4d4181aed51bd155010a25 to start an election
I20250902 21:32:34.710417 7651 raft_consensus.cc:1079] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 6 LEADER]: Signalling peer f79e0a34bf4d4181aed51bd155010a25 to start an election
I20250902 21:32:34.711162 7552 tablet_service.cc:1940] Received Run Leader Election RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b"
dest_uuid: "f79e0a34bf4d4181aed51bd155010a25"
from {username='slave'} at 127.4.52.196:57001
I20250902 21:32:34.711287 7552 raft_consensus.cc:491] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 6 FOLLOWER]: Starting forced leader election (received explicit request)
I20250902 21:32:34.711323 7552 raft_consensus.cc:3058] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 6 FOLLOWER]: Advancing to term 7
I20250902 21:32:34.712167 7552 raft_consensus.cc:513] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 7 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:34.712384 7552 leader_election.cc:290] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [CANDIDATE]: Term 7 election: Requested vote from peers 3a35f7f28cb9438dbcfb3196e167fdc5 (127.4.52.196:35541), b41910bc980f4cfdbcc6eb23e1084325 (127.4.52.194:33503)
I20250902 21:32:34.712558 7779 raft_consensus.cc:991] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: : Instructing follower b41910bc980f4cfdbcc6eb23e1084325 to start an election
I20250902 21:32:34.712602 7779 raft_consensus.cc:1079] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 6 LEADER]: Signalling peer b41910bc980f4cfdbcc6eb23e1084325 to start an election
I20250902 21:32:34.713501 7151 tablet_service.cc:1940] Received Run Leader Election RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b"
dest_uuid: "b41910bc980f4cfdbcc6eb23e1084325"
from {username='slave'} at 127.4.52.196:34469
I20250902 21:32:34.713585 7151 raft_consensus.cc:491] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 6 FOLLOWER]: Starting forced leader election (received explicit request)
I20250902 21:32:34.713615 7151 raft_consensus.cc:3058] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 6 FOLLOWER]: Advancing to term 7
I20250902 21:32:34.713641 7552 raft_consensus.cc:1238] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 7 FOLLOWER]: Rejecting Update request from peer 3a35f7f28cb9438dbcfb3196e167fdc5 for earlier term 6. Current term is 7. Ops: [6.13090-6.13091]
I20250902 21:32:34.714465 7151 raft_consensus.cc:513] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 7 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:34.714591 7151 leader_election.cc:290] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [CANDIDATE]: Term 7 election: Requested vote from peers 3a35f7f28cb9438dbcfb3196e167fdc5 (127.4.52.196:35541), f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587)
I20250902 21:32:34.714782 7552 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "b41910bc980f4cfdbcc6eb23e1084325" candidate_term: 7 candidate_status { last_received { term: 6 index: 13091 } } ignore_live_leader: true dest_uuid: "f79e0a34bf4d4181aed51bd155010a25"
I20250902 21:32:34.714859 7552 raft_consensus.cc:2391] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 7 FOLLOWER]: Leader election vote request: Denying vote to candidate b41910bc980f4cfdbcc6eb23e1084325 in current term 7: Already voted for candidate f79e0a34bf4d4181aed51bd155010a25 in this term.
I20250902 21:32:34.715314 7418 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "b41910bc980f4cfdbcc6eb23e1084325" candidate_term: 7 candidate_status { last_received { term: 6 index: 13091 } } ignore_live_leader: true dest_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5"
I20250902 21:32:34.715395 7418 raft_consensus.cc:3053] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 6 LEADER]: Stepping down as leader of term 6
I20250902 21:32:34.715421 7418 raft_consensus.cc:738] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 6 LEADER]: Becoming Follower/Learner. State: Replica: 3a35f7f28cb9438dbcfb3196e167fdc5, State: Running, Role: LEADER
I20250902 21:32:34.715430 7150 raft_consensus.cc:1238] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 7 FOLLOWER]: Rejecting Update request from peer 3a35f7f28cb9438dbcfb3196e167fdc5 for earlier term 6. Current term is 7. Ops: []
I20250902 21:32:34.715466 7418 consensus_queue.cc:260] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 13091, Committed index: 13091, Last appended: 6.13094, Last appended by leader: 13094, Current term: 6, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:34.715523 7418 raft_consensus.cc:3058] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 6 FOLLOWER]: Advancing to term 7
I20250902 21:32:34.716353 7418 raft_consensus.cc:2408] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 7 FOLLOWER]: Leader election vote request: Denying vote to candidate b41910bc980f4cfdbcc6eb23e1084325 for term 7 because replica has last-logged OpId of term: 6 index: 13094, which is greater than that of the candidate, which has last-logged OpId of term: 6 index: 13091.
I20250902 21:32:34.716539 7085 leader_election.cc:304] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [CANDIDATE]: Term 7 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: b41910bc980f4cfdbcc6eb23e1084325; no voters: 3a35f7f28cb9438dbcfb3196e167fdc5, f79e0a34bf4d4181aed51bd155010a25
I20250902 21:32:34.717026 7843 raft_consensus.cc:2747] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 7 FOLLOWER]: Leader election lost for term 7. Reason: could not achieve majority
I20250902 21:32:34.719179 7150 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "f79e0a34bf4d4181aed51bd155010a25" candidate_term: 7 candidate_status { last_received { term: 6 index: 13089 } } ignore_live_leader: true dest_uuid: "b41910bc980f4cfdbcc6eb23e1084325"
I20250902 21:32:34.719251 7150 raft_consensus.cc:2391] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 7 FOLLOWER]: Leader election vote request: Denying vote to candidate f79e0a34bf4d4181aed51bd155010a25 in current term 7: Already voted for candidate b41910bc980f4cfdbcc6eb23e1084325 in this term.
I20250902 21:32:34.719882 7418 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "f79e0a34bf4d4181aed51bd155010a25" candidate_term: 7 candidate_status { last_received { term: 6 index: 13089 } } ignore_live_leader: true dest_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5"
I20250902 21:32:34.719959 7418 raft_consensus.cc:2408] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 7 FOLLOWER]: Leader election vote request: Denying vote to candidate f79e0a34bf4d4181aed51bd155010a25 for term 7 because replica has last-logged OpId of term: 6 index: 13094, which is greater than that of the candidate, which has last-logged OpId of term: 6 index: 13089.
I20250902 21:32:34.720124 7486 leader_election.cc:304] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [CANDIDATE]: Term 7 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: f79e0a34bf4d4181aed51bd155010a25; no voters: 3a35f7f28cb9438dbcfb3196e167fdc5, b41910bc980f4cfdbcc6eb23e1084325
I20250902 21:32:34.720332 7844 raft_consensus.cc:2747] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 7 FOLLOWER]: Leader election lost for term 7. Reason: could not achieve majority
I20250902 21:32:34.741508 7520 tablet_service.cc:1423] Tablet server f79e0a34bf4d4181aed51bd155010a25 set to quiescing
I20250902 21:32:34.741565 7520 tablet_service.cc:1430] Tablet server has 0 leaders and 3 scanners
I20250902 21:32:34.755892 7120 tablet_service.cc:1423] Tablet server b41910bc980f4cfdbcc6eb23e1084325 set to quiescing
I20250902 21:32:34.755950 7120 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
W20250902 21:32:34.989089 7672 raft_consensus.cc:668] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: failed to trigger leader election: Illegal state: leader elections are disabled
W20250902 21:32:35.081184 7843 raft_consensus.cc:668] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: failed to trigger leader election: Illegal state: leader elections are disabled
W20250902 21:32:35.093518 7844 raft_consensus.cc:668] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: failed to trigger leader election: Illegal state: leader elections are disabled
I20250902 21:32:35.846065 7398 tablet_service.cc:1423] Tablet server 3a35f7f28cb9438dbcfb3196e167fdc5 set to quiescing
I20250902 21:32:35.846119 7398 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250902 21:32:35.879251 7520 tablet_service.cc:1423] Tablet server f79e0a34bf4d4181aed51bd155010a25 set to quiescing
I20250902 21:32:35.879307 7520 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250902 21:32:35.934648 4307 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6wlXYv/build/release/bin/kudu with pid 7067
I20250902 21:32:35.944223 4307 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6wlXYv/build/release/bin/kudu
/tmp/dist-test-task6wlXYv/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.4.52.194:33503
--local_ip_for_outbound_sockets=127.4.52.194
--tserver_master_addrs=127.4.52.254:39913
--webserver_port=37969
--webserver_interface=127.4.52.194
--builtin_ntp_servers=127.4.52.212:42387
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20250902 21:32:36.018687 7886 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250902 21:32:36.018882 7886 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250902 21:32:36.018913 7886 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250902 21:32:36.020300 7886 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250902 21:32:36.020360 7886 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.4.52.194
I20250902 21:32:36.021947 7886 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.4.52.212:42387
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.4.52.194:33503
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.4.52.194
--webserver_port=37969
--tserver_master_addrs=127.4.52.254:39913
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.7886
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.4.52.194
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 6ab3f249351a4ee5d73eb8bda3585c4dcd85d070
build type RELEASE
built by None at 02 Sep 2025 21:15:19 UTC on 5fd53c4cbb9d
build id 7867
I20250902 21:32:36.022167 7886 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250902 21:32:36.022377 7886 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250902 21:32:36.025048 7891 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:36.025055 7892 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:36.025120 7894 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:36.025261 7886 server_base.cc:1047] running on GCE node
I20250902 21:32:36.025442 7886 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250902 21:32:36.025637 7886 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250902 21:32:36.026798 7886 hybrid_clock.cc:648] HybridClock initialized: now 1756848756026782 us; error 32 us; skew 500 ppm
I20250902 21:32:36.027885 7886 webserver.cc:480] Webserver started at http://127.4.52.194:37969/ using document root <none> and password file <none>
I20250902 21:32:36.028090 7886 fs_manager.cc:362] Metadata directory not provided
I20250902 21:32:36.028152 7886 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250902 21:32:36.029321 7886 fs_manager.cc:714] Time spent opening directory manager: real 0.001s user 0.001s sys 0.000s
I20250902 21:32:36.029964 7900 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:36.030148 7886 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.001s sys 0.000s
I20250902 21:32:36.030216 7886 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/data,/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/wal
uuid: "b41910bc980f4cfdbcc6eb23e1084325"
format_stamp: "Formatted at 2025-09-02 21:32:16 on dist-test-slave-jkp9"
I20250902 21:32:36.030462 7886 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250902 21:32:36.042299 7886 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250902 21:32:36.042529 7886 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250902 21:32:36.042640 7886 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250902 21:32:36.042865 7886 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250902 21:32:36.043305 7907 ts_tablet_manager.cc:536] Loading tablet metadata (0/1 complete)
I20250902 21:32:36.044077 7886 ts_tablet_manager.cc:579] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20250902 21:32:36.044118 7886 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.001s user 0.000s sys 0.000s
I20250902 21:32:36.044154 7886 ts_tablet_manager.cc:594] Registering tablets (0/1 complete)
I20250902 21:32:36.044677 7886 ts_tablet_manager.cc:610] Registered 1 tablets
I20250902 21:32:36.044709 7886 ts_tablet_manager.cc:589] Time spent register tablets: real 0.001s user 0.000s sys 0.000s
I20250902 21:32:36.044755 7907 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Bootstrap starting.
I20250902 21:32:36.050325 7886 rpc_server.cc:307] RPC server started. Bound to: 127.4.52.194:33503
I20250902 21:32:36.050384 8014 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.4.52.194:33503 every 8 connection(s)
I20250902 21:32:36.050652 7886 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/data/info.pb
I20250902 21:32:36.055022 8015 heartbeater.cc:344] Connected to a master server at 127.4.52.254:39913
I20250902 21:32:36.055114 8015 heartbeater.cc:461] Registering TS with master...
I20250902 21:32:36.055312 8015 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
I20250902 21:32:36.055758 5497 ts_manager.cc:194] Re-registered known tserver with Master: b41910bc980f4cfdbcc6eb23e1084325 (127.4.52.194:33503)
I20250902 21:32:36.056113 5497 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.4.52.194:41455
I20250902 21:32:36.058233 4307 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6wlXYv/build/release/bin/kudu as pid 7886
I20250902 21:32:36.058359 4307 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6wlXYv/build/release/bin/kudu with pid 7202
I20250902 21:32:36.065270 4307 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6wlXYv/build/release/bin/kudu
/tmp/dist-test-task6wlXYv/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.4.52.193:34353
--local_ip_for_outbound_sockets=127.4.52.193
--tserver_master_addrs=127.4.52.254:39913
--webserver_port=43863
--webserver_interface=127.4.52.193
--builtin_ntp_servers=127.4.52.212:42387
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20250902 21:32:36.130705 7907 log.cc:826] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Log is configured to *not* fsync() on all Append() calls
W20250902 21:32:36.152227 8019 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250902 21:32:36.152370 8019 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250902 21:32:36.152400 8019 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250902 21:32:36.153680 8019 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250902 21:32:36.153738 8019 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.4.52.193
I20250902 21:32:36.155102 8019 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.4.52.212:42387
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.4.52.193:34353
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.4.52.193
--webserver_port=43863
--tserver_master_addrs=127.4.52.254:39913
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.8019
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.4.52.193
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 6ab3f249351a4ee5d73eb8bda3585c4dcd85d070
build type RELEASE
built by None at 02 Sep 2025 21:15:19 UTC on 5fd53c4cbb9d
build id 7867
I20250902 21:32:36.155323 8019 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250902 21:32:36.155530 8019 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250902 21:32:36.157914 8025 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:36.157940 8028 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:36.157940 8026 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:36.158149 8019 server_base.cc:1047] running on GCE node
I20250902 21:32:36.158285 8019 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250902 21:32:36.158453 8019 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250902 21:32:36.159571 8019 hybrid_clock.cc:648] HybridClock initialized: now 1756848756159562 us; error 28 us; skew 500 ppm
I20250902 21:32:36.160565 8019 webserver.cc:480] Webserver started at http://127.4.52.193:43863/ using document root <none> and password file <none>
I20250902 21:32:36.160739 8019 fs_manager.cc:362] Metadata directory not provided
I20250902 21:32:36.160778 8019 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250902 21:32:36.161832 8019 fs_manager.cc:714] Time spent opening directory manager: real 0.001s user 0.001s sys 0.000s
I20250902 21:32:36.162330 8034 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:36.162482 8019 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:36.162546 8019 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/data,/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/wal
uuid: "6311ff5fc63a49108dc3000117399229"
format_stamp: "Formatted at 2025-09-02 21:32:16 on dist-test-slave-jkp9"
I20250902 21:32:36.162781 8019 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250902 21:32:36.169772 8019 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250902 21:32:36.169965 8019 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250902 21:32:36.170058 8019 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250902 21:32:36.170222 8019 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250902 21:32:36.170466 8019 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250902 21:32:36.170496 8019 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:36.170526 8019 ts_tablet_manager.cc:610] Registered 0 tablets
I20250902 21:32:36.170552 8019 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:36.176121 8019 rpc_server.cc:307] RPC server started. Bound to: 127.4.52.193:34353
I20250902 21:32:36.176170 8147 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.4.52.193:34353 every 8 connection(s)
I20250902 21:32:36.176446 8019 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/data/info.pb
I20250902 21:32:36.180234 4307 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6wlXYv/build/release/bin/kudu as pid 8019
I20250902 21:32:36.180363 4307 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6wlXYv/build/release/bin/kudu with pid 7334
I20250902 21:32:36.182332 8148 heartbeater.cc:344] Connected to a master server at 127.4.52.254:39913
I20250902 21:32:36.182401 8148 heartbeater.cc:461] Registering TS with master...
I20250902 21:32:36.182559 8148 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
I20250902 21:32:36.186349 5497 ts_manager.cc:194] Re-registered known tserver with Master: 6311ff5fc63a49108dc3000117399229 (127.4.52.193:34353)
I20250902 21:32:36.186733 5497 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.4.52.193:39385
W20250902 21:32:36.195190 6094 meta_cache.cc:302] tablet 1bd11ec67527495b831ab711f8a2f39b: replica 3a35f7f28cb9438dbcfb3196e167fdc5 (127.4.52.196:35541) has failed: Network error: recv got EOF from 127.4.52.196:35541 (error 108)
I20250902 21:32:36.195963 4307 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6wlXYv/build/release/bin/kudu
/tmp/dist-test-task6wlXYv/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.4.52.196:35541
--local_ip_for_outbound_sockets=127.4.52.196
--tserver_master_addrs=127.4.52.254:39913
--webserver_port=36885
--webserver_interface=127.4.52.196
--builtin_ntp_servers=127.4.52.212:42387
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20250902 21:32:36.197907 7494 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:54266: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:36.198802 7494 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:54266: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:36.201229 7494 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:54266: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:36.218097 7494 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:54266: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:36.219090 7494 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:54266: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:36.220191 7494 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:54266: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:36.243727 7494 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:54266: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:36.244753 7494 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:54266: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:36.247822 7494 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:54266: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:36.281780 7494 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:54266: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:36.282847 7494 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:54266: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:36.284924 7494 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:54266: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:36.304442 8151 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250902 21:32:36.304641 8151 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250902 21:32:36.304675 8151 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250902 21:32:36.306692 8151 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250902 21:32:36.306811 8151 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.4.52.196
I20250902 21:32:36.309093 8151 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.4.52.212:42387
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.4.52.196:35541
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.4.52.196
--webserver_port=36885
--tserver_master_addrs=127.4.52.254:39913
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.8151
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.4.52.196
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 6ab3f249351a4ee5d73eb8bda3585c4dcd85d070
build type RELEASE
built by None at 02 Sep 2025 21:15:19 UTC on 5fd53c4cbb9d
build id 7867
I20250902 21:32:36.309351 8151 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250902 21:32:36.309636 8151 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250902 21:32:36.312342 8158 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:36.312355 8159 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:36.312750 8151 server_base.cc:1047] running on GCE node
W20250902 21:32:36.314862 8161 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:36.315068 8151 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250902 21:32:36.315330 8151 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250902 21:32:36.316466 8151 hybrid_clock.cc:648] HybridClock initialized: now 1756848756316449 us; error 29 us; skew 500 ppm
I20250902 21:32:36.317543 8151 webserver.cc:480] Webserver started at http://127.4.52.196:36885/ using document root <none> and password file <none>
I20250902 21:32:36.317734 8151 fs_manager.cc:362] Metadata directory not provided
I20250902 21:32:36.317771 8151 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250902 21:32:36.318825 8151 fs_manager.cc:714] Time spent opening directory manager: real 0.001s user 0.001s sys 0.000s
I20250902 21:32:36.319425 8167 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:36.319583 8151 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.001s sys 0.000s
I20250902 21:32:36.319646 8151 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/data,/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/wal
uuid: "3a35f7f28cb9438dbcfb3196e167fdc5"
format_stamp: "Formatted at 2025-09-02 21:32:16 on dist-test-slave-jkp9"
I20250902 21:32:36.319877 8151 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
W20250902 21:32:36.324690 7494 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:54266: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:36.328722 7494 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:54266: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:36.330873 7494 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:54266: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
I20250902 21:32:36.337389 8151 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250902 21:32:36.337605 8151 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250902 21:32:36.337702 8151 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250902 21:32:36.337934 8151 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250902 21:32:36.338316 8174 ts_tablet_manager.cc:536] Loading tablet metadata (0/1 complete)
I20250902 21:32:36.339361 8151 ts_tablet_manager.cc:579] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20250902 21:32:36.339402 8151 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.001s user 0.000s sys 0.000s
I20250902 21:32:36.339447 8151 ts_tablet_manager.cc:594] Registering tablets (0/1 complete)
I20250902 21:32:36.339939 8151 ts_tablet_manager.cc:610] Registered 1 tablets
I20250902 21:32:36.339967 8151 ts_tablet_manager.cc:589] Time spent register tablets: real 0.001s user 0.000s sys 0.000s
I20250902 21:32:36.340042 8174 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Bootstrap starting.
I20250902 21:32:36.346510 8151 rpc_server.cc:307] RPC server started. Bound to: 127.4.52.196:35541
I20250902 21:32:36.346871 8151 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/data/info.pb
I20250902 21:32:36.347146 8281 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.4.52.196:35541 every 8 connection(s)
I20250902 21:32:36.351400 4307 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6wlXYv/build/release/bin/kudu as pid 8151
I20250902 21:32:36.351498 4307 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6wlXYv/build/release/bin/kudu with pid 7467
I20250902 21:32:36.362039 8282 heartbeater.cc:344] Connected to a master server at 127.4.52.254:39913
I20250902 21:32:36.362143 8282 heartbeater.cc:461] Registering TS with master...
I20250902 21:32:36.362334 8282 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
I20250902 21:32:36.362910 5495 ts_manager.cc:194] Re-registered known tserver with Master: 3a35f7f28cb9438dbcfb3196e167fdc5 (127.4.52.196:35541)
I20250902 21:32:36.363469 5495 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.4.52.196:51771
I20250902 21:32:36.366614 4307 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6wlXYv/build/release/bin/kudu
/tmp/dist-test-task6wlXYv/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.4.52.195:46587
--local_ip_for_outbound_sockets=127.4.52.195
--tserver_master_addrs=127.4.52.254:39913
--webserver_port=32789
--webserver_interface=127.4.52.195
--builtin_ntp_servers=127.4.52.212:42387
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20250902 21:32:36.445741 8174 log.cc:826] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Log is configured to *not* fsync() on all Append() calls
W20250902 21:32:36.474009 8286 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250902 21:32:36.474295 8286 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250902 21:32:36.474345 8286 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250902 21:32:36.476527 8286 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250902 21:32:36.476601 8286 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.4.52.195
I20250902 21:32:36.478842 8286 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.4.52.212:42387
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.4.52.195:46587
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.4.52.195
--webserver_port=32789
--tserver_master_addrs=127.4.52.254:39913
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.8286
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.4.52.195
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 6ab3f249351a4ee5d73eb8bda3585c4dcd85d070
build type RELEASE
built by None at 02 Sep 2025 21:15:19 UTC on 5fd53c4cbb9d
build id 7867
I20250902 21:32:36.479091 8286 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250902 21:32:36.479334 8286 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250902 21:32:36.481940 8293 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:36.482014 8295 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:36.482168 8292 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:36.482219 8286 server_base.cc:1047] running on GCE node
I20250902 21:32:36.482420 8286 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250902 21:32:36.482605 8286 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250902 21:32:36.483745 8286 hybrid_clock.cc:648] HybridClock initialized: now 1756848756483716 us; error 46 us; skew 500 ppm
I20250902 21:32:36.485096 8286 webserver.cc:480] Webserver started at http://127.4.52.195:32789/ using document root <none> and password file <none>
I20250902 21:32:36.485308 8286 fs_manager.cc:362] Metadata directory not provided
I20250902 21:32:36.485375 8286 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250902 21:32:36.486827 8286 fs_manager.cc:714] Time spent opening directory manager: real 0.001s user 0.001s sys 0.000s
I20250902 21:32:36.487614 8301 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:36.487790 8286 fs_manager.cc:730] Time spent opening block manager: real 0.001s user 0.001s sys 0.000s
I20250902 21:32:36.487869 8286 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/data,/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/wal
uuid: "f79e0a34bf4d4181aed51bd155010a25"
format_stamp: "Formatted at 2025-09-02 21:32:16 on dist-test-slave-jkp9"
I20250902 21:32:36.488160 8286 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250902 21:32:36.498312 8286 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250902 21:32:36.498538 8286 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250902 21:32:36.498642 8286 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250902 21:32:36.498961 8286 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250902 21:32:36.499461 8308 ts_tablet_manager.cc:536] Loading tablet metadata (0/1 complete)
I20250902 21:32:36.500429 8286 ts_tablet_manager.cc:579] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20250902 21:32:36.500483 8286 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.001s user 0.000s sys 0.000s
I20250902 21:32:36.500510 8286 ts_tablet_manager.cc:594] Registering tablets (0/1 complete)
I20250902 21:32:36.501183 8286 ts_tablet_manager.cc:610] Registered 1 tablets
I20250902 21:32:36.501217 8286 ts_tablet_manager.cc:589] Time spent register tablets: real 0.001s user 0.001s sys 0.000s
I20250902 21:32:36.507797 8286 rpc_server.cc:307] RPC server started. Bound to: 127.4.52.195:46587
I20250902 21:32:36.508159 8286 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/data/info.pb
I20250902 21:32:36.508518 8308 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Bootstrap starting.
I20250902 21:32:36.513334 8416 heartbeater.cc:344] Connected to a master server at 127.4.52.254:39913
I20250902 21:32:36.513476 8416 heartbeater.cc:461] Registering TS with master...
I20250902 21:32:36.513697 4307 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6wlXYv/build/release/bin/kudu as pid 8286
I20250902 21:32:36.515694 8416 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
I20250902 21:32:36.516227 5496 ts_manager.cc:194] Re-registered known tserver with Master: f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587)
I20250902 21:32:36.516731 5496 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.4.52.195:37961
I20250902 21:32:36.517169 8415 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.4.52.195:46587 every 8 connection(s)
I20250902 21:32:36.625669 8308 log.cc:826] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Log is configured to *not* fsync() on all Append() calls
I20250902 21:32:36.675034 8216 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250902 21:32:36.679414 7949 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250902 21:32:36.685099 8350 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250902 21:32:36.695766 8082 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250902 21:32:37.056928 8015 heartbeater.cc:499] Master 127.4.52.254:39913 was elected leader, sending a full tablet report...
I20250902 21:32:37.180913 7907 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Bootstrap replayed 1/3 log segments. Stats: ops{read=4623 overwritten=0 applied=4621 ignored=0} inserts{seen=230900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20250902 21:32:37.187587 8148 heartbeater.cc:499] Master 127.4.52.254:39913 was elected leader, sending a full tablet report...
I20250902 21:32:37.249011 8174 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Bootstrap replayed 1/3 log segments. Stats: ops{read=4623 overwritten=0 applied=4620 ignored=0} inserts{seen=230850 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20250902 21:32:37.364454 8282 heartbeater.cc:499] Master 127.4.52.254:39913 was elected leader, sending a full tablet report...
I20250902 21:32:37.517549 8416 heartbeater.cc:499] Master 127.4.52.254:39913 was elected leader, sending a full tablet report...
I20250902 21:32:37.761049 8308 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Bootstrap replayed 1/3 log segments. Stats: ops{read=4623 overwritten=0 applied=4620 ignored=0} inserts{seen=230850 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20250902 21:32:37.982549 8174 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Bootstrap replayed 2/3 log segments. Stats: ops{read=9245 overwritten=0 applied=9243 ignored=0} inserts{seen=461900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20250902 21:32:38.378455 7907 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Bootstrap replayed 2/3 log segments. Stats: ops{read=9271 overwritten=0 applied=9268 ignored=0} inserts{seen=463150 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20250902 21:32:38.610136 8174 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Bootstrap replayed 3/3 log segments. Stats: ops{read=13094 overwritten=0 applied=13091 ignored=0} inserts{seen=654300 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20250902 21:32:38.610569 8174 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Bootstrap complete.
I20250902 21:32:38.615507 8174 ts_tablet_manager.cc:1397] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Time spent bootstrapping tablet: real 2.276s user 1.916s sys 0.339s
I20250902 21:32:38.616897 8174 raft_consensus.cc:357] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 7 FOLLOWER]: Replica starting. Triggering 3 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:38.617550 8174 raft_consensus.cc:738] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 7 FOLLOWER]: Becoming Follower/Learner. State: Replica: 3a35f7f28cb9438dbcfb3196e167fdc5, State: Initialized, Role: FOLLOWER
I20250902 21:32:38.617741 8174 consensus_queue.cc:260] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 13091, Last appended: 6.13094, Last appended by leader: 13094, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:38.618062 8174 ts_tablet_manager.cc:1428] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Time spent starting tablet: real 0.002s user 0.003s sys 0.000s
I20250902 21:32:38.812876 8308 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Bootstrap replayed 2/3 log segments. Stats: ops{read=9261 overwritten=0 applied=9259 ignored=0} inserts{seen=462700 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
W20250902 21:32:38.896549 6121 scanner-internal.cc:458] Time spent opening tablet: real 3.806s user 0.001s sys 0.000s
I20250902 21:32:38.924968 8457 raft_consensus.cc:491] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 7 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20250902 21:32:38.925578 8457 raft_consensus.cc:513] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 7 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:38.926462 8457 leader_election.cc:290] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [CANDIDATE]: Term 8 pre-election: Requested pre-vote from peers f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587), b41910bc980f4cfdbcc6eb23e1084325 (127.4.52.194:33503)
I20250902 21:32:38.942580 8361 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" candidate_term: 8 candidate_status { last_received { term: 6 index: 13094 } } ignore_live_leader: false dest_uuid: "f79e0a34bf4d4181aed51bd155010a25" is_pre_election: true
W20250902 21:32:38.943755 8168 leader_election.cc:343] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [CANDIDATE]: Term 8 pre-election: Tablet error from VoteRequest() call to peer f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587): Illegal state: must be running to vote when last-logged opid is not known
I20250902 21:32:38.948734 7968 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" candidate_term: 8 candidate_status { last_received { term: 6 index: 13094 } } ignore_live_leader: false dest_uuid: "b41910bc980f4cfdbcc6eb23e1084325" is_pre_election: true
W20250902 21:32:38.949700 8168 leader_election.cc:343] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [CANDIDATE]: Term 8 pre-election: Tablet error from VoteRequest() call to peer b41910bc980f4cfdbcc6eb23e1084325 (127.4.52.194:33503): Illegal state: must be running to vote when last-logged opid is not known
I20250902 21:32:38.949805 8168 leader_election.cc:304] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [CANDIDATE]: Term 8 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 3a35f7f28cb9438dbcfb3196e167fdc5; no voters: b41910bc980f4cfdbcc6eb23e1084325, f79e0a34bf4d4181aed51bd155010a25
I20250902 21:32:38.949939 8457 raft_consensus.cc:2747] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 7 FOLLOWER]: Leader pre-election lost for term 8. Reason: could not achieve majority
I20250902 21:32:39.288036 7907 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Bootstrap replayed 3/3 log segments. Stats: ops{read=13091 overwritten=0 applied=13089 ignored=0} inserts{seen=654200 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20250902 21:32:39.288627 7907 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Bootstrap complete.
I20250902 21:32:39.294865 7907 ts_tablet_manager.cc:1397] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Time spent bootstrapping tablet: real 3.250s user 2.886s sys 0.351s
I20250902 21:32:39.295532 7907 raft_consensus.cc:357] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 7 FOLLOWER]: Replica starting. Triggering 2 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:39.296321 7907 raft_consensus.cc:738] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 7 FOLLOWER]: Becoming Follower/Learner. State: Replica: b41910bc980f4cfdbcc6eb23e1084325, State: Initialized, Role: FOLLOWER
I20250902 21:32:39.296500 7907 consensus_queue.cc:260] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 13089, Last appended: 6.13091, Last appended by leader: 13091, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:39.296715 7907 ts_tablet_manager.cc:1428] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Time spent starting tablet: real 0.002s user 0.001s sys 0.000s
W20250902 21:32:39.372824 6122 scanner-internal.cc:458] Time spent opening tablet: real 4.007s user 0.001s sys 0.000s
W20250902 21:32:39.382133 6123 scanner-internal.cc:458] Time spent opening tablet: real 4.005s user 0.001s sys 0.001s
I20250902 21:32:39.393730 8457 raft_consensus.cc:491] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 7 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20250902 21:32:39.393837 8457 raft_consensus.cc:513] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 7 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:39.393991 8457 leader_election.cc:290] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [CANDIDATE]: Term 8 pre-election: Requested pre-vote from peers f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587), b41910bc980f4cfdbcc6eb23e1084325 (127.4.52.194:33503)
I20250902 21:32:39.394138 8361 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" candidate_term: 8 candidate_status { last_received { term: 6 index: 13094 } } ignore_live_leader: false dest_uuid: "f79e0a34bf4d4181aed51bd155010a25" is_pre_election: true
W20250902 21:32:39.394309 8168 leader_election.cc:343] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [CANDIDATE]: Term 8 pre-election: Tablet error from VoteRequest() call to peer f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587): Illegal state: must be running to vote when last-logged opid is not known
I20250902 21:32:39.394848 7968 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" candidate_term: 8 candidate_status { last_received { term: 6 index: 13094 } } ignore_live_leader: false dest_uuid: "b41910bc980f4cfdbcc6eb23e1084325" is_pre_election: true
I20250902 21:32:39.394963 7968 raft_consensus.cc:2466] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 7 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 3a35f7f28cb9438dbcfb3196e167fdc5 in term 7.
I20250902 21:32:39.395159 8168 leader_election.cc:304] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [CANDIDATE]: Term 8 pre-election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 3a35f7f28cb9438dbcfb3196e167fdc5, b41910bc980f4cfdbcc6eb23e1084325; no voters: f79e0a34bf4d4181aed51bd155010a25
I20250902 21:32:39.395376 8457 raft_consensus.cc:2802] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 7 FOLLOWER]: Leader pre-election won for term 8
I20250902 21:32:39.395434 8457 raft_consensus.cc:491] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 7 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20250902 21:32:39.395459 8457 raft_consensus.cc:3058] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 7 FOLLOWER]: Advancing to term 8
I20250902 21:32:39.396320 8457 raft_consensus.cc:513] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 8 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:39.396409 8457 leader_election.cc:290] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [CANDIDATE]: Term 8 election: Requested vote from peers f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587), b41910bc980f4cfdbcc6eb23e1084325 (127.4.52.194:33503)
I20250902 21:32:39.396618 8361 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" candidate_term: 8 candidate_status { last_received { term: 6 index: 13094 } } ignore_live_leader: false dest_uuid: "f79e0a34bf4d4181aed51bd155010a25"
W20250902 21:32:39.396884 8168 leader_election.cc:343] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [CANDIDATE]: Term 8 election: Tablet error from VoteRequest() call to peer f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587): Illegal state: must be running to vote when last-logged opid is not known
I20250902 21:32:39.397140 7968 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" candidate_term: 8 candidate_status { last_received { term: 6 index: 13094 } } ignore_live_leader: false dest_uuid: "b41910bc980f4cfdbcc6eb23e1084325"
I20250902 21:32:39.397229 7968 raft_consensus.cc:3058] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 7 FOLLOWER]: Advancing to term 8
I20250902 21:32:39.397994 7968 raft_consensus.cc:2466] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 8 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 3a35f7f28cb9438dbcfb3196e167fdc5 in term 8.
I20250902 21:32:39.398168 8168 leader_election.cc:304] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [CANDIDATE]: Term 8 election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 3a35f7f28cb9438dbcfb3196e167fdc5, b41910bc980f4cfdbcc6eb23e1084325; no voters: f79e0a34bf4d4181aed51bd155010a25
I20250902 21:32:39.398308 8457 raft_consensus.cc:2802] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 8 FOLLOWER]: Leader election won for term 8
I20250902 21:32:39.398424 8457 raft_consensus.cc:695] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 8 LEADER]: Becoming Leader. State: Replica: 3a35f7f28cb9438dbcfb3196e167fdc5, State: Running, Role: LEADER
I20250902 21:32:39.398491 8457 consensus_queue.cc:237] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 13091, Committed index: 13091, Last appended: 6.13094, Last appended by leader: 13094, Current term: 8, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:39.399183 5496 catalog_manager.cc:5582] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 reported cstate change: term changed from 6 to 8. New cstate: current_term: 8 leader_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } health_report { overall_health: UNKNOWN } } }
I20250902 21:32:39.502417 7968 raft_consensus.cc:1273] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 8 FOLLOWER]: Refusing update from remote peer 3a35f7f28cb9438dbcfb3196e167fdc5: Log matching property violated. Preceding OpId in replica: term: 6 index: 13091. Preceding OpId from leader: term: 8 index: 13095. (index mismatch)
I20250902 21:32:39.503209 8467 consensus_queue.cc:1035] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [LEADER]: Connected to new peer: Peer: permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 13095, Last known committed idx: 13089, Time since last communication: 0.000s
W20250902 21:32:39.506444 8168 consensus_peers.cc:489] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 -> Peer f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587): Couldn't send request to peer f79e0a34bf4d4181aed51bd155010a25. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: BOOTSTRAPPING. This is attempt 1: this message will repeat every 5th retry.
I20250902 21:32:39.741276 8308 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Bootstrap replayed 3/3 log segments. Stats: ops{read=13089 overwritten=0 applied=13089 ignored=0} inserts{seen=654200 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250902 21:32:39.741881 8308 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Bootstrap complete.
I20250902 21:32:39.747810 8308 ts_tablet_manager.cc:1397] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Time spent bootstrapping tablet: real 3.239s user 2.763s sys 0.431s
I20250902 21:32:39.748772 8308 raft_consensus.cc:357] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 7 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:39.749094 8308 raft_consensus.cc:738] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 7 FOLLOWER]: Becoming Follower/Learner. State: Replica: f79e0a34bf4d4181aed51bd155010a25, State: Initialized, Role: FOLLOWER
I20250902 21:32:39.749248 8308 consensus_queue.cc:260] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 13089, Last appended: 6.13089, Last appended by leader: 13089, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:39.749573 8308 ts_tablet_manager.cc:1428] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Time spent starting tablet: real 0.002s user 0.002s sys 0.000s
I20250902 21:32:39.816097 8361 raft_consensus.cc:3058] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 7 FOLLOWER]: Advancing to term 8
I20250902 21:32:39.817443 8361 raft_consensus.cc:1273] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 8 FOLLOWER]: Refusing update from remote peer 3a35f7f28cb9438dbcfb3196e167fdc5: Log matching property violated. Preceding OpId in replica: term: 6 index: 13089. Preceding OpId from leader: term: 6 index: 13094. (index mismatch)
I20250902 21:32:39.818629 8483 consensus_queue.cc:1037] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [LEADER]: Got LMP mismatch error from peer: Peer: permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 13095, Last known committed idx: 13089, Time since last communication: 0.000s
I20250902 21:32:39.869259 8485 mvcc.cc:204] Tried to move back new op lower bound from 7196052520199479296 to 7196052518496407552. Current Snapshot: MvccSnapshot[applied={T|T < 7196052520158007296}]
I20250902 21:32:42.016758 7949 tablet_service.cc:1430] Tablet server has 0 leaders and 2 scanners
I20250902 21:32:42.022421 8216 tablet_service.cc:1430] Tablet server has 1 leaders and 0 scanners
I20250902 21:32:42.024402 8350 tablet_service.cc:1430] Tablet server has 0 leaders and 1 scanners
I20250902 21:32:42.047596 8082 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250902 21:32:42.373571 5496 ts_manager.cc:284] Unset tserver state for 3a35f7f28cb9438dbcfb3196e167fdc5 from MAINTENANCE_MODE
I20250902 21:32:42.378264 5497 ts_manager.cc:284] Unset tserver state for 6311ff5fc63a49108dc3000117399229 from MAINTENANCE_MODE
I20250902 21:32:42.416512 5497 ts_manager.cc:284] Unset tserver state for f79e0a34bf4d4181aed51bd155010a25 from MAINTENANCE_MODE
I20250902 21:32:42.478390 5497 ts_manager.cc:284] Unset tserver state for b41910bc980f4cfdbcc6eb23e1084325 from MAINTENANCE_MODE
I20250902 21:32:42.519316 8015 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
I20250902 21:32:42.746261 5497 ts_manager.cc:295] Set tserver state for b41910bc980f4cfdbcc6eb23e1084325 to MAINTENANCE_MODE
I20250902 21:32:42.808544 5497 ts_manager.cc:295] Set tserver state for f79e0a34bf4d4181aed51bd155010a25 to MAINTENANCE_MODE
I20250902 21:32:42.830690 8416 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
I20250902 21:32:42.842454 8282 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
I20250902 21:32:42.849025 5497 ts_manager.cc:295] Set tserver state for 3a35f7f28cb9438dbcfb3196e167fdc5 to MAINTENANCE_MODE
I20250902 21:32:42.867175 5497 ts_manager.cc:295] Set tserver state for 6311ff5fc63a49108dc3000117399229 to MAINTENANCE_MODE
I20250902 21:32:43.012571 7949 tablet_service.cc:1423] Tablet server b41910bc980f4cfdbcc6eb23e1084325 set to quiescing
I20250902 21:32:43.012634 7949 tablet_service.cc:1430] Tablet server has 0 leaders and 2 scanners
I20250902 21:32:43.136991 8216 tablet_service.cc:1423] Tablet server 3a35f7f28cb9438dbcfb3196e167fdc5 set to quiescing
I20250902 21:32:43.137046 8216 tablet_service.cc:1430] Tablet server has 1 leaders and 0 scanners
I20250902 21:32:43.138664 8483 raft_consensus.cc:991] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: : Instructing follower f79e0a34bf4d4181aed51bd155010a25 to start an election
I20250902 21:32:43.138732 8483 raft_consensus.cc:1079] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 8 LEADER]: Signalling peer f79e0a34bf4d4181aed51bd155010a25 to start an election
I20250902 21:32:43.139339 8361 tablet_service.cc:1940] Received Run Leader Election RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b"
dest_uuid: "f79e0a34bf4d4181aed51bd155010a25"
from {username='slave'} at 127.4.52.196:49683
I20250902 21:32:43.139436 8361 raft_consensus.cc:491] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 8 FOLLOWER]: Starting forced leader election (received explicit request)
I20250902 21:32:43.139489 8361 raft_consensus.cc:3058] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 8 FOLLOWER]: Advancing to term 9
I20250902 21:32:43.140367 8361 raft_consensus.cc:513] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 9 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:43.140592 8361 leader_election.cc:290] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [CANDIDATE]: Term 9 election: Requested vote from peers 3a35f7f28cb9438dbcfb3196e167fdc5 (127.4.52.196:35541), b41910bc980f4cfdbcc6eb23e1084325 (127.4.52.194:33503)
I20250902 21:32:43.142233 8361 raft_consensus.cc:1238] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 9 FOLLOWER]: Rejecting Update request from peer 3a35f7f28cb9438dbcfb3196e167fdc5 for earlier term 8. Current term is 9. Ops: [8.16305-8.16306]
I20250902 21:32:43.143950 8517 consensus_queue.cc:1046] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [LEADER]: Peer responded invalid term: Peer: permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 }, Status: INVALID_TERM, Last received: 8.16304, Next index: 16305, Last known committed idx: 16303, Time since last communication: 0.000s
I20250902 21:32:43.144239 8467 raft_consensus.cc:3053] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 8 LEADER]: Stepping down as leader of term 8
I20250902 21:32:43.144285 8467 raft_consensus.cc:738] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 8 LEADER]: Becoming Follower/Learner. State: Replica: 3a35f7f28cb9438dbcfb3196e167fdc5, State: Running, Role: LEADER
I20250902 21:32:43.144342 8467 consensus_queue.cc:260] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 16307, Committed index: 16307, Last appended: 8.16307, Last appended by leader: 16307, Current term: 8, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:43.145995 8467 raft_consensus.cc:3058] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 8 FOLLOWER]: Advancing to term 9
W20250902 21:32:43.150553 8196 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.150701 8195 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.150556 8178 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
I20250902 21:32:43.152099 7968 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "f79e0a34bf4d4181aed51bd155010a25" candidate_term: 9 candidate_status { last_received { term: 8 index: 16304 } } ignore_live_leader: true dest_uuid: "b41910bc980f4cfdbcc6eb23e1084325"
I20250902 21:32:43.152177 7968 raft_consensus.cc:3058] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 8 FOLLOWER]: Advancing to term 9
I20250902 21:32:43.152940 7968 raft_consensus.cc:2408] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 9 FOLLOWER]: Leader election vote request: Denying vote to candidate f79e0a34bf4d4181aed51bd155010a25 for term 9 because replica has last-logged OpId of term: 8 index: 16307, which is greater than that of the candidate, which has last-logged OpId of term: 8 index: 16304.
W20250902 21:32:43.154093 8329 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.154606 8329 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
I20250902 21:32:43.154839 8236 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "f79e0a34bf4d4181aed51bd155010a25" candidate_term: 9 candidate_status { last_received { term: 8 index: 16304 } } ignore_live_leader: true dest_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5"
I20250902 21:32:43.154949 8236 raft_consensus.cc:2408] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 9 FOLLOWER]: Leader election vote request: Denying vote to candidate f79e0a34bf4d4181aed51bd155010a25 for term 9 because replica has last-logged OpId of term: 8 index: 16307, which is greater than that of the candidate, which has last-logged OpId of term: 8 index: 16304.
I20250902 21:32:43.155092 8304 leader_election.cc:304] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [CANDIDATE]: Term 9 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: f79e0a34bf4d4181aed51bd155010a25; no voters: 3a35f7f28cb9438dbcfb3196e167fdc5, b41910bc980f4cfdbcc6eb23e1084325
I20250902 21:32:43.155287 8667 raft_consensus.cc:2747] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 9 FOLLOWER]: Leader election lost for term 9. Reason: could not achieve majority
W20250902 21:32:43.156595 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.157584 7928 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.158591 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.159479 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
I20250902 21:32:43.159513 8350 tablet_service.cc:1423] Tablet server f79e0a34bf4d4181aed51bd155010a25 set to quiescing
I20250902 21:32:43.159543 8350 tablet_service.cc:1430] Tablet server has 0 leaders and 1 scanners
W20250902 21:32:43.162431 8178 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.163391 8178 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.166244 8178 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.170698 8329 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.170723 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.174913 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.177707 7927 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.179170 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
I20250902 21:32:43.180253 8082 tablet_service.cc:1423] Tablet server 6311ff5fc63a49108dc3000117399229 set to quiescing
I20250902 21:32:43.180306 8082 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
W20250902 21:32:43.181286 7927 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.185983 8178 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.188786 8178 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.189924 8178 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
I20250902 21:32:43.191571 8148 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
W20250902 21:32:43.194407 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.197381 8329 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.200502 8329 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.204902 7927 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.207002 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.209976 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.216414 8178 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.218737 8178 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.223822 8178 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.230376 8329 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.231263 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.237507 8329 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.243805 7928 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.244668 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.253062 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.257761 8178 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.261423 8178 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.267733 8178 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.273303 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.277253 8329 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.283318 8329 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.288780 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.293730 7928 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.297802 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.305840 8178 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.312486 8178 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.313663 8178 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.327126 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.330219 8329 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.333267 8329 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.348570 7928 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.351773 7928 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.351950 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.371577 8178 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.371855 8178 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.374826 8178 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.391695 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.392581 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.395627 8329 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.412397 7928 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.416158 7926 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.416251 7928 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.434449 8178 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.437996 8178 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.442008 8178 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.444765 8678 raft_consensus.cc:668] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: failed to trigger leader election: Illegal state: leader elections are disabled
W20250902 21:32:43.449656 8467 raft_consensus.cc:668] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: failed to trigger leader election: Illegal state: leader elections are disabled
W20250902 21:32:43.458484 8329 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.461576 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.467705 8329 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.485011 7926 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.488025 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.493155 7928 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.511871 8178 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.516839 8178 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.522130 8178 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.533879 8667 raft_consensus.cc:668] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: failed to trigger leader election: Illegal state: leader elections are disabled
W20250902 21:32:43.540609 8329 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.544667 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.550827 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.567101 7928 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.573176 7928 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.581329 7926 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.595083 8178 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.601948 8178 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.613088 8178 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.627581 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.630673 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.643719 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.661082 7926 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.661204 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.675135 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.692049 8178 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.695900 8178 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.706892 8178 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.724373 8329 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.731506 8329 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.739478 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.757860 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.765048 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.771871 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.795717 8178 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.799834 8178 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.805667 8178 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.833218 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.838330 8329 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.841382 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.869635 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.877751 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.880769 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.907601 8178 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.918722 8195 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.918722 8178 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.946285 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.960307 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.960330 8329 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:43.986838 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.002933 7926 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.003014 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.030911 8178 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.045799 8178 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.046015 8195 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.073580 8329 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.086651 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.090670 8329 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.115098 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.130216 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.135068 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.162710 8195 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.176093 8195 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.178862 8195 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.209483 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.223711 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.225836 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
I20250902 21:32:44.228259 7949 tablet_service.cc:1423] Tablet server b41910bc980f4cfdbcc6eb23e1084325 set to quiescing
I20250902 21:32:44.228307 7949 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
W20250902 21:32:44.260154 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.270282 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
I20250902 21:32:44.271999 8216 tablet_service.cc:1423] Tablet server 3a35f7f28cb9438dbcfb3196e167fdc5 set to quiescing
I20250902 21:32:44.272051 8216 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
W20250902 21:32:44.274255 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.306298 8195 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.316113 8195 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
I20250902 21:32:44.326129 8350 tablet_service.cc:1423] Tablet server f79e0a34bf4d4181aed51bd155010a25 set to quiescing
I20250902 21:32:44.326174 8350 tablet_service.cc:1430] Tablet server has 0 leaders and 1 scanners
W20250902 21:32:44.327262 8195 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.354744 8329 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.364898 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.376938 8329 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.404202 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.412623 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.425235 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.456439 8195 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.464425 8195 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.475590 8195 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.510360 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.515396 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.525609 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.563397 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.565872 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.578438 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.616787 8195 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.618455 8195 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.630818 8195 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.671558 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.672441 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.683578 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.725495 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.730186 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.740638 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.783651 8195 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.785779 8195 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.798025 8195 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.842716 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.843588 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.856863 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.903793 7926 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.903798 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.915467 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.964038 8195 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.964038 8178 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:44.976049 8195 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:45.022837 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:45.024828 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:45.039065 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:45.086772 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:45.089465 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:45.101119 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:45.149298 8195 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:45.155227 8195 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:45.162238 8195 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:45.214917 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:45.219028 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:45.224118 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:45.277819 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:45.283325 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:45.287820 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:45.343956 8195 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:45.349960 8195 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:45.355094 8195 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:45.410111 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:45.417049 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:45.420197 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
I20250902 21:32:45.460368 8350 tablet_service.cc:1423] Tablet server f79e0a34bf4d4181aed51bd155010a25 set to quiescing
I20250902 21:32:45.460420 8350 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
W20250902 21:32:45.479795 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:45.485435 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:45.487907 7929 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48066: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
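[editor's note] The "set to quiescing" / "has N leaders and M scanners" pairs above, together with the "leader elections are disabled" messages, show the rolling-restart pattern this test exercises: each tablet server is put into quiescing mode so it sheds leadership and stops contending in elections, and the test waits until it reports zero tablet leaders and zero active scanners before killing and restarting the process (as happens on the next lines). Below is a minimal readiness-check sketch under those assumptions; the type and function names are invented for illustration and are not Kudu's API.

// Illustrative sketch (assumed names): when is it safe to restart a quiescing
// tablet server, based on the counters reported in the log lines above?
#include <cstdint>
#include <iostream>

struct QuiescingState {
  bool quiescing;               // "Tablet server ... set to quiescing"
  int32_t num_leaders;          // "Tablet server has N leaders ..."
  int32_t num_active_scanners;  // "... and M scanners"
};

bool SafeToRestart(const QuiescingState& s) {
  // Restart only once the server is quiescing and has nothing left to drain.
  return s.quiescing && s.num_leaders == 0 && s.num_active_scanners == 0;
}

int main() {
  QuiescingState draining{true, 0, 1};  // e.g. the earlier "0 leaders and 1 scanners" report
  QuiescingState drained{true, 0, 0};   // e.g. the later "0 leaders and 0 scanners" report
  std::cout << SafeToRestart(draining) << " " << SafeToRestart(drained)
            << std::endl;  // prints "0 1"
  return 0;
}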
I20250902 21:32:45.515789 4307 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6wlXYv/build/release/bin/kudu with pid 7886
I20250902 21:32:45.526644 4307 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6wlXYv/build/release/bin/kudu
/tmp/dist-test-task6wlXYv/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.4.52.194:33503
--local_ip_for_outbound_sockets=127.4.52.194
--tserver_master_addrs=127.4.52.254:39913
--webserver_port=37969
--webserver_interface=127.4.52.194
--builtin_ntp_servers=127.4.52.212:42387
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20250902 21:32:45.551363 8195 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:45.554253 8195 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:45.557317 8195 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:59948: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:45.596788 8723 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250902 21:32:45.596922 8723 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250902 21:32:45.596957 8723 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250902 21:32:45.598245 8723 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250902 21:32:45.598294 8723 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.4.52.194
I20250902 21:32:45.599642 8723 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.4.52.212:42387
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.4.52.194:33503
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.4.52.194
--webserver_port=37969
--tserver_master_addrs=127.4.52.254:39913
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.8723
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.4.52.194
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 6ab3f249351a4ee5d73eb8bda3585c4dcd85d070
build type RELEASE
built by None at 02 Sep 2025 21:15:19 UTC on 5fd53c4cbb9d
build id 7867
I20250902 21:32:45.599822 8723 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250902 21:32:45.600047 8723 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250902 21:32:45.602174 8728 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:45.602221 8731 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:45.602221 8729 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:45.602352 8723 server_base.cc:1047] running on GCE node
I20250902 21:32:45.602547 8723 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250902 21:32:45.602730 8723 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250902 21:32:45.603881 8723 hybrid_clock.cc:648] HybridClock initialized: now 1756848765603864 us; error 31 us; skew 500 ppm
I20250902 21:32:45.604864 8723 webserver.cc:480] Webserver started at http://127.4.52.194:37969/ using document root <none> and password file <none>
I20250902 21:32:45.605042 8723 fs_manager.cc:362] Metadata directory not provided
I20250902 21:32:45.605088 8723 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250902 21:32:45.606148 8723 fs_manager.cc:714] Time spent opening directory manager: real 0.000s user 0.001s sys 0.000s
I20250902 21:32:45.606681 8737 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:45.606828 8723 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:45.606890 8723 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/data,/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/wal
uuid: "b41910bc980f4cfdbcc6eb23e1084325"
format_stamp: "Formatted at 2025-09-02 21:32:16 on dist-test-slave-jkp9"
I20250902 21:32:45.607082 8723 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250902 21:32:45.617599 8723 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250902 21:32:45.617762 8723 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250902 21:32:45.617834 8723 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250902 21:32:45.617991 8723 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250902 21:32:45.618288 8744 ts_tablet_manager.cc:536] Loading tablet metadata (0/1 complete)
I20250902 21:32:45.619009 8723 ts_tablet_manager.cc:579] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20250902 21:32:45.619046 8723 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.001s user 0.000s sys 0.000s
I20250902 21:32:45.619067 8723 ts_tablet_manager.cc:594] Registering tablets (0/1 complete)
I20250902 21:32:45.619547 8723 ts_tablet_manager.cc:610] Registered 1 tablets
I20250902 21:32:45.619577 8723 ts_tablet_manager.cc:589] Time spent register tablets: real 0.001s user 0.000s sys 0.000s
I20250902 21:32:45.619632 8744 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Bootstrap starting.
W20250902 21:32:45.620189 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
I20250902 21:32:45.624883 8723 rpc_server.cc:307] RPC server started. Bound to: 127.4.52.194:33503
I20250902 21:32:45.624966 8851 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.4.52.194:33503 every 8 connection(s)
I20250902 21:32:45.625223 8723 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/data/info.pb
W20250902 21:32:45.626191 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:45.629344 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
I20250902 21:32:45.630482 4307 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6wlXYv/build/release/bin/kudu as pid 8723
I20250902 21:32:45.630584 4307 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6wlXYv/build/release/bin/kudu with pid 8019
I20250902 21:32:45.633777 8852 heartbeater.cc:344] Connected to a master server at 127.4.52.254:39913
I20250902 21:32:45.633888 8852 heartbeater.cc:461] Registering TS with master...
I20250902 21:32:45.634063 8852 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
I20250902 21:32:45.634495 5497 ts_manager.cc:194] Re-registered known tserver with Master: b41910bc980f4cfdbcc6eb23e1084325 (127.4.52.194:33503)
I20250902 21:32:45.634964 5497 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.4.52.194:47843
I20250902 21:32:45.638067 4307 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6wlXYv/build/release/bin/kudu
/tmp/dist-test-task6wlXYv/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.4.52.193:34353
--local_ip_for_outbound_sockets=127.4.52.193
--tserver_master_addrs=127.4.52.254:39913
--webserver_port=43863
--webserver_interface=127.4.52.193
--builtin_ntp_servers=127.4.52.212:42387
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20250902 21:32:45.667212 8744 log.cc:826] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Log is configured to *not* fsync() on all Append() calls
W20250902 21:32:45.715785 8856 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250902 21:32:45.715938 8856 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250902 21:32:45.715965 8856 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250902 21:32:45.717180 8856 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250902 21:32:45.717231 8856 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.4.52.193
I20250902 21:32:45.718616 8856 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.4.52.212:42387
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.4.52.193:34353
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.4.52.193
--webserver_port=43863
--tserver_master_addrs=127.4.52.254:39913
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.8856
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.4.52.193
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 6ab3f249351a4ee5d73eb8bda3585c4dcd85d070
build type RELEASE
built by None at 02 Sep 2025 21:15:19 UTC on 5fd53c4cbb9d
build id 7867
I20250902 21:32:45.718838 8856 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250902 21:32:45.719033 8856 file_cache.cc:492] Constructed file cache 'file cache' with capacity 419430
W20250902 21:32:45.721318 8867 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:45.721314 8865 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:45.721446 8864 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:45.721688 8856 server_base.cc:1047] running on GCE node
I20250902 21:32:45.721828 8856 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250902 21:32:45.722018 8856 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250902 21:32:45.723168 8856 hybrid_clock.cc:648] HybridClock initialized: now 1756848765723153 us; error 31 us; skew 500 ppm
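Editor's note: the HybridClock line above reports a current error bound of 31 us and an assumed maximum skew of 500 ppm. As a rough sketch (an assumption about how such a bound evolves between synchronizations, not a claim about Kudu's exact implementation), the worst-case error grows linearly at the skew rate until the next sync.

# Sketch only: error bound growth at the assumed maximum skew rate.
def max_error_us(initial_error_us, skew_ppm, elapsed_us):
    # 500 ppm == up to 500 microseconds of drift per second of elapsed time.
    return initial_error_us + skew_ppm * 1e-6 * elapsed_us

# Values from the log line above: error 31 us, skew 500 ppm.
print(max_error_us(31, 500, 1_000_000))  # after 1 s without sync -> 531.0 us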
I20250902 21:32:45.724260 8856 webserver.cc:480] Webserver started at http://127.4.52.193:43863/ using document root <none> and password file <none>
I20250902 21:32:45.724447 8856 fs_manager.cc:362] Metadata directory not provided
I20250902 21:32:45.724493 8856 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250902 21:32:45.725580 8856 fs_manager.cc:714] Time spent opening directory manager: real 0.001s user 0.001s sys 0.000s
I20250902 21:32:45.726074 8873 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:45.726235 8856 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:45.726297 8856 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/data,/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/wal
uuid: "6311ff5fc63a49108dc3000117399229"
format_stamp: "Formatted at 2025-09-02 21:32:16 on dist-test-slave-jkp9"
I20250902 21:32:45.726500 8856 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250902 21:32:45.739022 8856 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250902 21:32:45.739225 8856 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250902 21:32:45.739327 8856 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250902 21:32:45.739513 8856 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250902 21:32:45.739795 8856 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250902 21:32:45.739835 8856 ts_tablet_manager.cc:525] Time spent loading tablet metadata: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:45.739862 8856 ts_tablet_manager.cc:610] Registered 0 tablets
I20250902 21:32:45.739889 8856 ts_tablet_manager.cc:589] Time spent registering tablets: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:45.746145 8856 rpc_server.cc:307] RPC server started. Bound to: 127.4.52.193:34353
I20250902 21:32:45.746210 8986 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.4.52.193:34353 every 8 connection(s)
I20250902 21:32:45.746446 8856 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/data/info.pb
I20250902 21:32:45.750857 8987 heartbeater.cc:344] Connected to a master server at 127.4.52.254:39913
I20250902 21:32:45.750973 8987 heartbeater.cc:461] Registering TS with master...
I20250902 21:32:45.751199 8987 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
I20250902 21:32:45.751552 5497 ts_manager.cc:194] Re-registered known tserver with Master: 6311ff5fc63a49108dc3000117399229 (127.4.52.193:34353)
I20250902 21:32:45.751883 5497 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.4.52.193:52233
I20250902 21:32:45.754567 4307 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6wlXYv/build/release/bin/kudu as pid 8856
I20250902 21:32:45.754673 4307 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6wlXYv/build/release/bin/kudu with pid 8151
W20250902 21:32:45.770232 6094 connection.cc:537] client connection to 127.4.52.196:35541 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20250902 21:32:45.770308 6094 meta_cache.cc:302] tablet 1bd11ec67527495b831ab711f8a2f39b: replica 3a35f7f28cb9438dbcfb3196e167fdc5 (127.4.52.196:35541) has failed: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
I20250902 21:32:45.770392 6121 meta_cache.cc:1510] marking tablet server 3a35f7f28cb9438dbcfb3196e167fdc5 (127.4.52.196:35541) as failed
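Editor's note: the "Started ... as pid" / "Killing ... with pid" pairs in this log, together with the master's "Re-registered known tserver" lines, form the rolling-restart loop the test drives: each tablet server is taken down and brought back one at a time while clients briefly mark the stopped replica as failed. Below is a generic, hedged sketch of that loop; stop_server, start_server, and is_reregistered are hypothetical placeholders, not Kudu or mini-cluster APIs.

import time

# Hypothetical placeholders: a real harness would stop/start one tablet server
# process and ask the master whether it has re-registered.
def stop_server(ts):
    print(f"stopping {ts}")

def start_server(ts):
    print(f"starting {ts}")

def is_reregistered(ts):
    return True  # stand-in; a real check would query the master

def rolling_restart(tablet_servers, timeout_s=60):
    """Restart tablet servers one at a time, waiting for each to come back."""
    for ts in tablet_servers:
        stop_server(ts)
        start_server(ts)
        deadline = time.monotonic() + timeout_s
        while not is_reregistered(ts):
            if time.monotonic() > deadline:
                raise TimeoutError(f"{ts} did not re-register with the master")
            time.sleep(0.1)

rolling_restart(["ts-0", "ts-3", "ts-2"])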
I20250902 21:32:45.770673 4307 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6wlXYv/build/release/bin/kudu
/tmp/dist-test-task6wlXYv/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.4.52.196:35541
--local_ip_for_outbound_sockets=127.4.52.196
--tserver_master_addrs=127.4.52.254:39913
--webserver_port=36885
--webserver_interface=127.4.52.196
--builtin_ntp_servers=127.4.52.212:42387
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20250902 21:32:45.841567 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:45.842092 8990 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250902 21:32:45.842212 8990 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250902 21:32:45.842229 8990 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250902 21:32:45.843482 8990 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250902 21:32:45.843525 8990 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.4.52.196
W20250902 21:32:45.844666 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
I20250902 21:32:45.845082 8990 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.4.52.212:42387
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.4.52.196:35541
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.4.52.196
--webserver_port=36885
--tserver_master_addrs=127.4.52.254:39913
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.8990
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.4.52.196
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 6ab3f249351a4ee5d73eb8bda3585c4dcd85d070
build type RELEASE
built by None at 02 Sep 2025 21:15:19 UTC on 5fd53c4cbb9d
build id 7867
I20250902 21:32:45.845283 8990 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250902 21:32:45.845479 8990 file_cache.cc:492] Constructed file cache 'file cache' with capacity 419430
W20250902 21:32:45.845747 8312 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51554: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:45.847798 8995 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:45.847792 8998 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:45.847932 8990 server_base.cc:1047] running on GCE node
W20250902 21:32:45.848055 8996 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:45.848219 8990 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250902 21:32:45.848405 8990 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250902 21:32:45.849526 8990 hybrid_clock.cc:648] HybridClock initialized: now 1756848765849517 us; error 27 us; skew 500 ppm
I20250902 21:32:45.850431 8990 webserver.cc:480] Webserver started at http://127.4.52.196:36885/ using document root <none> and password file <none>
I20250902 21:32:45.850589 8990 fs_manager.cc:362] Metadata directory not provided
I20250902 21:32:45.850629 8990 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250902 21:32:45.851785 8990 fs_manager.cc:714] Time spent opening directory manager: real 0.001s user 0.000s sys 0.001s
I20250902 21:32:45.852414 9004 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:45.852560 8990 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:45.852619 8990 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/data,/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/wal
uuid: "3a35f7f28cb9438dbcfb3196e167fdc5"
format_stamp: "Formatted at 2025-09-02 21:32:16 on dist-test-slave-jkp9"
I20250902 21:32:45.852814 8990 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250902 21:32:45.871002 8990 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250902 21:32:45.871227 8990 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250902 21:32:45.871326 8990 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250902 21:32:45.871493 8990 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250902 21:32:45.871825 9011 ts_tablet_manager.cc:536] Loading tablet metadata (0/1 complete)
I20250902 21:32:45.872773 8990 ts_tablet_manager.cc:579] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20250902 21:32:45.872809 8990 ts_tablet_manager.cc:525] Time spent loading tablet metadata: real 0.001s user 0.000s sys 0.000s
I20250902 21:32:45.872851 8990 ts_tablet_manager.cc:594] Registering tablets (0/1 complete)
I20250902 21:32:45.873318 8990 ts_tablet_manager.cc:610] Registered 1 tablets
I20250902 21:32:45.873346 8990 ts_tablet_manager.cc:589] Time spent registering tablets: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:45.873420 9011 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Bootstrap starting.
I20250902 21:32:45.879787 8990 rpc_server.cc:307] RPC server started. Bound to: 127.4.52.196:35541
I20250902 21:32:45.880069 8990 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/data/info.pb
I20250902 21:32:45.882916 9118 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.4.52.196:35541 every 8 connection(s)
I20250902 21:32:45.886040 4307 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6wlXYv/build/release/bin/kudu as pid 8990
I20250902 21:32:45.886137 4307 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6wlXYv/build/release/bin/kudu with pid 8286
I20250902 21:32:45.892513 9119 heartbeater.cc:344] Connected to a master server at 127.4.52.254:39913
I20250902 21:32:45.892621 9119 heartbeater.cc:461] Registering TS with master...
I20250902 21:32:45.892822 9119 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
I20250902 21:32:45.893362 5497 ts_manager.cc:194] Re-registered known tserver with Master: 3a35f7f28cb9438dbcfb3196e167fdc5 (127.4.52.196:35541)
I20250902 21:32:45.893808 5497 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.4.52.196:53225
I20250902 21:32:45.899782 4307 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6wlXYv/build/release/bin/kudu
/tmp/dist-test-task6wlXYv/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.4.52.195:46587
--local_ip_for_outbound_sockets=127.4.52.195
--tserver_master_addrs=127.4.52.254:39913
--webserver_port=32789
--webserver_interface=127.4.52.195
--builtin_ntp_servers=127.4.52.212:42387
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20250902 21:32:45.943040 9011 log.cc:826] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Log is configured to *not* fsync() on all Append() calls
W20250902 21:32:45.970585 9122 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250902 21:32:45.970716 9122 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250902 21:32:45.970743 9122 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250902 21:32:45.972038 9122 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250902 21:32:45.972091 9122 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.4.52.195
I20250902 21:32:45.973399 9122 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.4.52.212:42387
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.4.52.195:46587
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.4.52.195
--webserver_port=32789
--tserver_master_addrs=127.4.52.254:39913
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.9122
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.4.52.195
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 6ab3f249351a4ee5d73eb8bda3585c4dcd85d070
build type RELEASE
built by None at 02 Sep 2025 21:15:19 UTC on 5fd53c4cbb9d
build id 7867
I20250902 21:32:45.973572 9122 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250902 21:32:45.973758 9122 file_cache.cc:492] Constructed file cache 'file cache' with capacity 419430
W20250902 21:32:45.976116 9128 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:45.976318 9122 server_base.cc:1047] running on GCE node
W20250902 21:32:45.976162 9131 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:45.976116 9129 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:45.976606 9122 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250902 21:32:45.976783 9122 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250902 21:32:45.977905 9122 hybrid_clock.cc:648] HybridClock initialized: now 1756848765977898 us; error 27 us; skew 500 ppm
I20250902 21:32:45.978864 9122 webserver.cc:480] Webserver started at http://127.4.52.195:32789/ using document root <none> and password file <none>
I20250902 21:32:45.979035 9122 fs_manager.cc:362] Metadata directory not provided
I20250902 21:32:45.979072 9122 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250902 21:32:45.980036 9122 fs_manager.cc:714] Time spent opening directory manager: real 0.000s user 0.000s sys 0.001s
I20250902 21:32:45.980571 9137 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:45.980712 9122 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.000s sys 0.001s
I20250902 21:32:45.980769 9122 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/data,/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/wal
uuid: "f79e0a34bf4d4181aed51bd155010a25"
format_stamp: "Formatted at 2025-09-02 21:32:16 on dist-test-slave-jkp9"
I20250902 21:32:45.980964 9122 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250902 21:32:45.998061 9122 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250902 21:32:45.998250 9122 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250902 21:32:45.998342 9122 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250902 21:32:45.998499 9122 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250902 21:32:46.002952 9145 ts_tablet_manager.cc:536] Loading tablet metadata (0/1 complete)
I20250902 21:32:46.003691 9122 ts_tablet_manager.cc:579] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20250902 21:32:46.003731 9122 ts_tablet_manager.cc:525] Time spent loading tablet metadata: real 0.005s user 0.000s sys 0.000s
I20250902 21:32:46.003769 9122 ts_tablet_manager.cc:594] Registering tablets (0/1 complete)
I20250902 21:32:46.004274 9122 ts_tablet_manager.cc:610] Registered 1 tablets
I20250902 21:32:46.004343 9122 ts_tablet_manager.cc:589] Time spent registering tablets: real 0.001s user 0.000s sys 0.001s
I20250902 21:32:46.004379 9145 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Bootstrap starting.
I20250902 21:32:46.010303 9122 rpc_server.cc:307] RPC server started. Bound to: 127.4.52.195:46587
I20250902 21:32:46.010623 9122 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/data/info.pb
I20250902 21:32:46.013780 4307 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6wlXYv/build/release/bin/kudu as pid 9122
I20250902 21:32:46.017170 9252 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.4.52.195:46587 every 8 connection(s)
I20250902 21:32:46.029942 9253 heartbeater.cc:344] Connected to a master server at 127.4.52.254:39913
I20250902 21:32:46.030021 9253 heartbeater.cc:461] Registering TS with master...
I20250902 21:32:46.030196 9253 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
I20250902 21:32:46.030674 5497 ts_manager.cc:194] Re-registered known tserver with Master: f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587)
I20250902 21:32:46.031179 5497 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.4.52.195:59471
I20250902 21:32:46.073374 9145 log.cc:826] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Log is configured to *not* fsync() on all Append() calls
I20250902 21:32:46.174566 9043 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250902 21:32:46.180258 8786 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250902 21:32:46.184077 9180 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250902 21:32:46.198699 8921 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250902 21:32:46.635717 8852 heartbeater.cc:499] Master 127.4.52.254:39913 was elected leader, sending a full tablet report...
I20250902 21:32:46.669147 8744 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Bootstrap replayed 1/4 log segments. Stats: ops{read=4623 overwritten=0 applied=4621 ignored=0} inserts{seen=230900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20250902 21:32:46.752700 8987 heartbeater.cc:499] Master 127.4.52.254:39913 was elected leader, sending a full tablet report...
I20250902 21:32:46.894646 9119 heartbeater.cc:499] Master 127.4.52.254:39913 was elected leader, sending a full tablet report...
I20250902 21:32:46.978919 9145 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Bootstrap replayed 1/4 log segments. Stats: ops{read=4623 overwritten=0 applied=4620 ignored=0} inserts{seen=230850 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20250902 21:32:47.031917 9253 heartbeater.cc:499] Master 127.4.52.254:39913 was elected leader, sending a full tablet report...
I20250902 21:32:47.089190 9011 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Bootstrap replayed 1/4 log segments. Stats: ops{read=4623 overwritten=0 applied=4620 ignored=0} inserts{seen=230850 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20250902 21:32:47.395602 8744 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Bootstrap replayed 2/4 log segments. Stats: ops{read=9246 overwritten=0 applied=9243 ignored=0} inserts{seen=461900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20250902 21:32:48.176388 8744 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Bootstrap replayed 3/4 log segments. Stats: ops{read=13914 overwritten=0 applied=13912 ignored=0} inserts{seen=695300 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20250902 21:32:48.176697 9145 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Bootstrap replayed 2/4 log segments. Stats: ops{read=9245 overwritten=0 applied=9242 ignored=0} inserts{seen=461850 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20250902 21:32:48.284873 9011 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Bootstrap replayed 2/4 log segments. Stats: ops{read=9245 overwritten=0 applied=9242 ignored=0} inserts{seen=461850 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20250902 21:32:48.570183 8744 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Bootstrap replayed 4/4 log segments. Stats: ops{read=16307 overwritten=0 applied=16307 ignored=0} inserts{seen=815050 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250902 21:32:48.570667 8744 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Bootstrap complete.
I20250902 21:32:48.576918 8744 ts_tablet_manager.cc:1397] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Time spent bootstrapping tablet: real 2.957s user 2.623s sys 0.326s
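Editor's note: each "Bootstrap replayed N/4 log segments" line above carries a compact stats block. As a hedged illustration (a throwaway parser, not Kudu tooling), the counters can be pulled out with a regex, for example to confirm that ops read and ops applied converge once the pending replicates drain.

import re

# Illustrative parser for the "Stats: ops{...} inserts{...}" blocks above.
STATS_RE = re.compile(
    r"ops\{read=(\d+) overwritten=(\d+) applied=(\d+) ignored=(\d+)\} "
    r"inserts\{seen=(\d+) ignored=(\d+)\}.*Pending: (\d+) replicates"
)

line = ("Bootstrap replayed 4/4 log segments. Stats: "
        "ops{read=16307 overwritten=0 applied=16307 ignored=0} "
        "inserts{seen=815050 ignored=0} mutations{seen=0 ignored=0} "
        "orphaned_commits=0. Pending: 0 replicates")

m = STATS_RE.search(line)
read, overwritten, applied, ignored, seen, ins_ignored, pending = map(int, m.groups())
print(read - applied, pending)  # -> 0 0: everything read has been applied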
I20250902 21:32:48.577764 8744 raft_consensus.cc:357] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 9 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:48.577948 8744 raft_consensus.cc:738] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 9 FOLLOWER]: Becoming Follower/Learner. State: Replica: b41910bc980f4cfdbcc6eb23e1084325, State: Initialized, Role: FOLLOWER
I20250902 21:32:48.578052 8744 consensus_queue.cc:260] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 16307, Last appended: 8.16307, Last appended by leader: 16307, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:48.578280 8744 ts_tablet_manager.cc:1428] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Time spent starting tablet: real 0.001s user 0.003s sys 0.000s
W20250902 21:32:48.609225 8756 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48046: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:48.614923 8756 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48046: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:48.618065 8756 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48046: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
I20250902 21:32:48.850714 9295 raft_consensus.cc:491] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 9 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20250902 21:32:48.850865 9295 raft_consensus.cc:513] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 9 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:48.851490 9295 leader_election.cc:290] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [CANDIDATE]: Term 10 pre-election: Requested pre-vote from peers 3a35f7f28cb9438dbcfb3196e167fdc5 (127.4.52.196:35541), f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587)
I20250902 21:32:48.855031 9073 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "b41910bc980f4cfdbcc6eb23e1084325" candidate_term: 10 candidate_status { last_received { term: 8 index: 16307 } } ignore_live_leader: false dest_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" is_pre_election: true
I20250902 21:32:48.855026 9207 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "b41910bc980f4cfdbcc6eb23e1084325" candidate_term: 10 candidate_status { last_received { term: 8 index: 16307 } } ignore_live_leader: false dest_uuid: "f79e0a34bf4d4181aed51bd155010a25" is_pre_election: true
W20250902 21:32:48.855964 8740 leader_election.cc:343] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [CANDIDATE]: Term 10 pre-election: Tablet error from VoteRequest() call to peer 3a35f7f28cb9438dbcfb3196e167fdc5 (127.4.52.196:35541): Illegal state: must be running to vote when last-logged opid is not known
W20250902 21:32:48.856067 8738 leader_election.cc:343] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [CANDIDATE]: Term 10 pre-election: Tablet error from VoteRequest() call to peer f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587): Illegal state: must be running to vote when last-logged opid is not known
I20250902 21:32:48.856103 8738 leader_election.cc:304] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [CANDIDATE]: Term 10 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: b41910bc980f4cfdbcc6eb23e1084325; no voters: 3a35f7f28cb9438dbcfb3196e167fdc5, f79e0a34bf4d4181aed51bd155010a25
I20250902 21:32:48.856186 9295 raft_consensus.cc:2747] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 9 FOLLOWER]: Leader pre-election lost for term 10. Reason: could not achieve majority
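Editor's note: the pre-election above is decided as soon as two "no" responses arrive. With three voters a candidate needs two "yes" votes (its own implicit vote plus one peer), so two rejections make a win impossible; that is why the summary reads "1 yes votes; 2 no votes". A minimal sketch of that tally, assuming the standard Raft-style majority rule rather than Kudu's exact bookkeeping:

# Minimal pre-election tally sketch; the candidate counts its own "yes".
def decide(num_voters, peer_votes):
    majority = num_voters // 2 + 1
    yes = 1 + sum(1 for v in peer_votes if v == "yes")
    no = sum(1 for v in peer_votes if v == "no")
    if yes >= majority:
        return "won"
    if no > num_voters - majority:
        return "lost"  # a majority of yes votes is no longer reachable
    return "pending"

# The situation in the log: both peers reject because their replicas are not
# running yet ("must be running to vote"), so the candidate loses 1-2.
print(decide(3, ["no", "no"]))  # -> lost
# Later, once the peers have bootstrapped, a single peer "yes" is enough:
print(decide(3, ["yes"]))       # -> won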
W20250902 21:32:48.926141 8756 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48046: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:48.932749 8756 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48046: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:48.933816 8756 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48046: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
I20250902 21:32:49.120136 9145 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Bootstrap replayed 3/4 log segments. Stats: ops{read=13944 overwritten=0 applied=13942 ignored=0} inserts{seen=696800 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20250902 21:32:49.176338 9011 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Bootstrap replayed 3/4 log segments. Stats: ops{read=13866 overwritten=0 applied=13864 ignored=0} inserts{seen=692900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
W20250902 21:32:49.252799 8756 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48046: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:49.256325 8756 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48046: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
I20250902 21:32:49.257118 9295 raft_consensus.cc:491] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 9 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20250902 21:32:49.257185 9295 raft_consensus.cc:513] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 9 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:49.257328 9295 leader_election.cc:290] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [CANDIDATE]: Term 10 pre-election: Requested pre-vote from peers 3a35f7f28cb9438dbcfb3196e167fdc5 (127.4.52.196:35541), f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587)
I20250902 21:32:49.257468 9073 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "b41910bc980f4cfdbcc6eb23e1084325" candidate_term: 10 candidate_status { last_received { term: 8 index: 16307 } } ignore_live_leader: false dest_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" is_pre_election: true
I20250902 21:32:49.257498 9207 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "b41910bc980f4cfdbcc6eb23e1084325" candidate_term: 10 candidate_status { last_received { term: 8 index: 16307 } } ignore_live_leader: false dest_uuid: "f79e0a34bf4d4181aed51bd155010a25" is_pre_election: true
W20250902 21:32:49.257637 8740 leader_election.cc:343] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [CANDIDATE]: Term 10 pre-election: Tablet error from VoteRequest() call to peer 3a35f7f28cb9438dbcfb3196e167fdc5 (127.4.52.196:35541): Illegal state: must be running to vote when last-logged opid is not known
W20250902 21:32:49.257680 8738 leader_election.cc:343] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [CANDIDATE]: Term 10 pre-election: Tablet error from VoteRequest() call to peer f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587): Illegal state: must be running to vote when last-logged opid is not known
I20250902 21:32:49.257717 8740 leader_election.cc:304] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [CANDIDATE]: Term 10 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: b41910bc980f4cfdbcc6eb23e1084325; no voters: 3a35f7f28cb9438dbcfb3196e167fdc5, f79e0a34bf4d4181aed51bd155010a25
I20250902 21:32:49.258020 9295 raft_consensus.cc:2747] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 9 FOLLOWER]: Leader pre-election lost for term 10. Reason: could not achieve majority
W20250902 21:32:49.262426 8756 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48046: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
I20250902 21:32:49.502385 9145 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Bootstrap replayed 4/4 log segments. Stats: ops{read=16304 overwritten=0 applied=16303 ignored=0} inserts{seen=814850 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20250902 21:32:49.502839 9145 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Bootstrap complete.
I20250902 21:32:49.508545 9145 ts_tablet_manager.cc:1397] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Time spent bootstrapping tablet: real 3.504s user 3.002s sys 0.470s
I20250902 21:32:49.508999 9145 raft_consensus.cc:357] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 9 FOLLOWER]: Replica starting. Triggering 1 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:49.509527 9145 raft_consensus.cc:738] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 9 FOLLOWER]: Becoming Follower/Learner. State: Replica: f79e0a34bf4d4181aed51bd155010a25, State: Initialized, Role: FOLLOWER
I20250902 21:32:49.509683 9145 consensus_queue.cc:260] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 16303, Last appended: 8.16304, Last appended by leader: 16304, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:49.509902 9145 ts_tablet_manager.cc:1428] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Time spent starting tablet: real 0.001s user 0.003s sys 0.000s
I20250902 21:32:49.564376 9011 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Bootstrap replayed 4/4 log segments. Stats: ops{read=16307 overwritten=0 applied=16307 ignored=0} inserts{seen=815050 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250902 21:32:49.564811 9011 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Bootstrap complete.
I20250902 21:32:49.570151 9011 ts_tablet_manager.cc:1397] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Time spent bootstrapping tablet: real 3.697s user 3.172s sys 0.507s
I20250902 21:32:49.570547 9011 raft_consensus.cc:357] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 9 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:49.570700 9011 raft_consensus.cc:738] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 9 FOLLOWER]: Becoming Follower/Learner. State: Replica: 3a35f7f28cb9438dbcfb3196e167fdc5, State: Initialized, Role: FOLLOWER
I20250902 21:32:49.570794 9011 consensus_queue.cc:260] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 16307, Last appended: 8.16307, Last appended by leader: 16307, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:49.570994 9011 ts_tablet_manager.cc:1428] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Time spent starting tablet: real 0.001s user 0.000s sys 0.004s
W20250902 21:32:49.590255 9165 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51606: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:49.590672 8756 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48046: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:49.600163 8756 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48046: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
I20250902 21:32:49.638899 9295 raft_consensus.cc:491] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 9 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20250902 21:32:49.638979 9295 raft_consensus.cc:513] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 9 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:49.639171 9295 leader_election.cc:290] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [CANDIDATE]: Term 10 pre-election: Requested pre-vote from peers 3a35f7f28cb9438dbcfb3196e167fdc5 (127.4.52.196:35541), f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587)
I20250902 21:32:49.639389 9073 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "b41910bc980f4cfdbcc6eb23e1084325" candidate_term: 10 candidate_status { last_received { term: 8 index: 16307 } } ignore_live_leader: false dest_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" is_pre_election: true
I20250902 21:32:49.639386 9207 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "b41910bc980f4cfdbcc6eb23e1084325" candidate_term: 10 candidate_status { last_received { term: 8 index: 16307 } } ignore_live_leader: false dest_uuid: "f79e0a34bf4d4181aed51bd155010a25" is_pre_election: true
I20250902 21:32:49.639513 9073 raft_consensus.cc:2466] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 9 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate b41910bc980f4cfdbcc6eb23e1084325 in term 9.
I20250902 21:32:49.639513 9207 raft_consensus.cc:2466] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 9 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate b41910bc980f4cfdbcc6eb23e1084325 in term 9.
I20250902 21:32:49.639652 8738 leader_election.cc:304] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [CANDIDATE]: Term 10 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: b41910bc980f4cfdbcc6eb23e1084325, f79e0a34bf4d4181aed51bd155010a25; no voters:
I20250902 21:32:49.639752 9295 raft_consensus.cc:2802] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 9 FOLLOWER]: Leader pre-election won for term 10
I20250902 21:32:49.639816 9295 raft_consensus.cc:491] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 9 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20250902 21:32:49.639845 9295 raft_consensus.cc:3058] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 9 FOLLOWER]: Advancing to term 10
I20250902 21:32:49.640689 9295 raft_consensus.cc:513] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 10 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:49.640796 9295 leader_election.cc:290] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [CANDIDATE]: Term 10 election: Requested vote from peers 3a35f7f28cb9438dbcfb3196e167fdc5 (127.4.52.196:35541), f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587)
I20250902 21:32:49.640933 9073 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "b41910bc980f4cfdbcc6eb23e1084325" candidate_term: 10 candidate_status { last_received { term: 8 index: 16307 } } ignore_live_leader: false dest_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5"
I20250902 21:32:49.640944 9207 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "b41910bc980f4cfdbcc6eb23e1084325" candidate_term: 10 candidate_status { last_received { term: 8 index: 16307 } } ignore_live_leader: false dest_uuid: "f79e0a34bf4d4181aed51bd155010a25"
I20250902 21:32:49.640993 9073 raft_consensus.cc:3058] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 9 FOLLOWER]: Advancing to term 10
I20250902 21:32:49.640996 9207 raft_consensus.cc:3058] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 9 FOLLOWER]: Advancing to term 10
I20250902 21:32:49.641722 9207 raft_consensus.cc:2466] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 10 FOLLOWER]: Leader election vote request: Granting yes vote for candidate b41910bc980f4cfdbcc6eb23e1084325 in term 10.
I20250902 21:32:49.641767 9073 raft_consensus.cc:2466] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 10 FOLLOWER]: Leader election vote request: Granting yes vote for candidate b41910bc980f4cfdbcc6eb23e1084325 in term 10.
I20250902 21:32:49.641870 8738 leader_election.cc:304] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [CANDIDATE]: Term 10 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: b41910bc980f4cfdbcc6eb23e1084325, f79e0a34bf4d4181aed51bd155010a25; no voters:
I20250902 21:32:49.641960 9295 raft_consensus.cc:2802] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 10 FOLLOWER]: Leader election won for term 10
I20250902 21:32:49.642083 9295 raft_consensus.cc:695] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 10 LEADER]: Becoming Leader. State: Replica: b41910bc980f4cfdbcc6eb23e1084325, State: Running, Role: LEADER
I20250902 21:32:49.642177 9295 consensus_queue.cc:237] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 16307, Committed index: 16307, Last appended: 8.16307, Last appended by leader: 16307, Current term: 10, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:49.642766 5497 catalog_manager.cc:5582] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 reported cstate change: term changed from 8 to 10, leader changed from 3a35f7f28cb9438dbcfb3196e167fdc5 (127.4.52.196) to b41910bc980f4cfdbcc6eb23e1084325 (127.4.52.194). New cstate: current_term: 10 leader_uuid: "b41910bc980f4cfdbcc6eb23e1084325" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } health_report { overall_health: HEALTHY } } }
I20250902 21:32:49.703810 9207 raft_consensus.cc:1273] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 10 FOLLOWER]: Refusing update from remote peer b41910bc980f4cfdbcc6eb23e1084325: Log matching property violated. Preceding OpId in replica: term: 8 index: 16304. Preceding OpId from leader: term: 10 index: 16309. (index mismatch)
I20250902 21:32:49.703810 9073 raft_consensus.cc:1273] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 10 FOLLOWER]: Refusing update from remote peer b41910bc980f4cfdbcc6eb23e1084325: Log matching property violated. Preceding OpId in replica: term: 8 index: 16307. Preceding OpId from leader: term: 10 index: 16309. (index mismatch)
I20250902 21:32:49.704074 9305 consensus_queue.cc:1035] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [LEADER]: Connected to new peer: Peer: permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 16308, Last known committed idx: 16307, Time since last communication: 0.000s
I20250902 21:32:49.704241 9295 consensus_queue.cc:1035] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [LEADER]: Connected to new peer: Peer: permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 16308, Last known committed idx: 16303, Time since last communication: 0.000s
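The two "Log matching property violated ... (index mismatch)" refusals and the leader's LMP_MISMATCH statuses above are the standard Raft consistency check: a follower only accepts a batch of entries if it already holds the entry whose (term, index) immediately precedes the batch, and the new leader walks its next index back until the logs line up. The following is a minimal, self-contained sketch of that check, not Kudu's actual code; the OpId/Log types and PrecedingEntryMatches() are illustrative assumptions.

```cpp
// Minimal sketch of the Raft "log matching" check a follower applies to a
// leader's update, mirroring the LMP_MISMATCH refusals in the log above.
// Illustrative only; the OpId and Log shapes are assumptions, not Kudu's API.
#include <cstdint>
#include <iostream>
#include <map>

struct OpId {
  int64_t term;
  int64_t index;
};

// The follower's log, keyed by index -> term of the entry at that index.
using Log = std::map<int64_t, int64_t>;

// Returns true if the follower may append entries that come right after
// `preceding`; false means the leader must retry with an earlier next index.
bool PrecedingEntryMatches(const Log& log, const OpId& preceding) {
  if (preceding.index == 0) return true;  // an empty prefix always matches
  auto it = log.find(preceding.index);
  return it != log.end() && it->second == preceding.term;
}

int main() {
  // Follower holds entries up to 8.16307; the leader first offers a batch
  // preceded by 10.16309, then falls back to the matching point.
  Log follower;
  for (int64_t i = 1; i <= 16307; ++i) follower[i] = 8;

  std::cout << std::boolalpha
            << PrecedingEntryMatches(follower, {10, 16309}) << "\n"   // false -> LMP_MISMATCH
            << PrecedingEntryMatches(follower, {8, 16307}) << "\n";   // true  -> accept
}
```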
I20250902 21:32:49.705946 9311 mvcc.cc:204] Tried to move back new op lower bound from 7196052560705363968 to 7196052560454631424. Current Snapshot: MvccSnapshot[applied={T|T < 7196052560704483328}]
I20250902 21:32:49.707602 9314 mvcc.cc:204] Tried to move back new op lower bound from 7196052560705363968 to 7196052560454631424. Current Snapshot: MvccSnapshot[applied={T|T < 7196052560704483328}]
W20250902 21:32:49.713661 9165 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51606: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:49.715457 9165 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51606: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:49.715487 9167 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51606: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:49.718969 9033 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:56996: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:49.719960 9033 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:56996: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:49.743909 6123 scanner-internal.cc:458] Time spent opening tablet: real 6.009s user 0.001s sys 0.001s
W20250902 21:32:49.770525 6122 scanner-internal.cc:458] Time spent opening tablet: real 6.007s user 0.000s sys 0.001s
W20250902 21:32:50.575426 6121 scanner-internal.cc:458] Time spent opening tablet: real 6.016s user 0.002s sys 0.001s
I20250902 21:32:51.414707 9043 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250902 21:32:51.421551 9180 tablet_service.cc:1430] Tablet server has 0 leaders and 3 scanners
I20250902 21:32:51.430572 8786 tablet_service.cc:1430] Tablet server has 1 leaders and 0 scanners
I20250902 21:32:51.442901 8921 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250902 21:32:51.798087 5497 ts_manager.cc:284] Unset tserver state for 3a35f7f28cb9438dbcfb3196e167fdc5 from MAINTENANCE_MODE
I20250902 21:32:51.798911 8987 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
I20250902 21:32:51.880820 5496 ts_manager.cc:284] Unset tserver state for f79e0a34bf4d4181aed51bd155010a25 from MAINTENANCE_MODE
I20250902 21:32:51.887197 5496 ts_manager.cc:284] Unset tserver state for b41910bc980f4cfdbcc6eb23e1084325 from MAINTENANCE_MODE
I20250902 21:32:51.896905 5496 ts_manager.cc:284] Unset tserver state for 6311ff5fc63a49108dc3000117399229 from MAINTENANCE_MODE
I20250902 21:32:52.213394 5496 ts_manager.cc:295] Set tserver state for f79e0a34bf4d4181aed51bd155010a25 to MAINTENANCE_MODE
I20250902 21:32:52.245081 5496 ts_manager.cc:295] Set tserver state for b41910bc980f4cfdbcc6eb23e1084325 to MAINTENANCE_MODE
I20250902 21:32:52.260231 5496 ts_manager.cc:295] Set tserver state for 3a35f7f28cb9438dbcfb3196e167fdc5 to MAINTENANCE_MODE
I20250902 21:32:52.281224 5496 ts_manager.cc:295] Set tserver state for 6311ff5fc63a49108dc3000117399229 to MAINTENANCE_MODE
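The Set/Unset lines above toggle per-tserver state on the master, and the point of MAINTENANCE_MODE in this test family is that replicas hosted by a tserver under maintenance are not immediately re-replicated when that server fails. Below is a minimal sketch of that decision under assumed data structures; TSStateMap and ShouldRereplicate() are hypothetical names, not Kudu's catalog-manager API.

```cpp
// Sketch of how a master-side component might decide whether a dead
// replica should be re-replicated, given per-tserver states such as
// MAINTENANCE_MODE. Names and structure are illustrative assumptions.
#include <iostream>
#include <string>
#include <unordered_map>

enum class TServerState { kNone, kMaintenanceMode };

class TSStateMap {
 public:
  void SetState(const std::string& uuid, TServerState s) { states_[uuid] = s; }
  void UnsetState(const std::string& uuid) { states_.erase(uuid); }
  TServerState GetState(const std::string& uuid) const {
    auto it = states_.find(uuid);
    return it == states_.end() ? TServerState::kNone : it->second;
  }

 private:
  std::unordered_map<std::string, TServerState> states_;
};

// Replicas on a failed server are only re-replicated if that server is not
// under maintenance; otherwise the failure is treated as temporary.
bool ShouldRereplicate(const TSStateMap& states, const std::string& failed_uuid) {
  return states.GetState(failed_uuid) != TServerState::kMaintenanceMode;
}

int main() {
  TSStateMap states;
  states.SetState("f79e0a34bf4d4181aed51bd155010a25", TServerState::kMaintenanceMode);
  std::cout << std::boolalpha
            << ShouldRereplicate(states, "f79e0a34bf4d4181aed51bd155010a25") << "\n"  // false
            << ShouldRereplicate(states, "some-other-tserver-uuid") << "\n";          // true
}
```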
I20250902 21:32:52.555956 8786 tablet_service.cc:1423] Tablet server b41910bc980f4cfdbcc6eb23e1084325 set to quiescing
I20250902 21:32:52.556013 8786 tablet_service.cc:1430] Tablet server has 1 leaders and 0 scanners
I20250902 21:32:52.556444 9305 raft_consensus.cc:991] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: : Instructing follower f79e0a34bf4d4181aed51bd155010a25 to start an election
I20250902 21:32:52.556517 9305 raft_consensus.cc:1079] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 10 LEADER]: Signalling peer f79e0a34bf4d4181aed51bd155010a25 to start an election
I20250902 21:32:52.556680 9206 tablet_service.cc:1940] Received Run Leader Election RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b"
dest_uuid: "f79e0a34bf4d4181aed51bd155010a25"
from {username='slave'} at 127.4.52.194:35187
I20250902 21:32:52.556762 9206 raft_consensus.cc:491] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 10 FOLLOWER]: Starting forced leader election (received explicit request)
I20250902 21:32:52.556788 9206 raft_consensus.cc:3058] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 10 FOLLOWER]: Advancing to term 11
I20250902 21:32:52.557498 9206 raft_consensus.cc:513] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 11 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:52.557709 9206 leader_election.cc:290] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [CANDIDATE]: Term 11 election: Requested vote from peers 3a35f7f28cb9438dbcfb3196e167fdc5 (127.4.52.196:35541), b41910bc980f4cfdbcc6eb23e1084325 (127.4.52.194:33503)
I20250902 21:32:52.558512 9207 raft_consensus.cc:1238] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 11 FOLLOWER]: Rejecting Update request from peer b41910bc980f4cfdbcc6eb23e1084325 for earlier term 10. Current term is 11. Ops: []
I20250902 21:32:52.558777 9295 consensus_queue.cc:1046] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [LEADER]: Peer responded invalid term: Peer: permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 }, Status: INVALID_TERM, Last received: 10.18830, Next index: 18831, Last known committed idx: 18829, Time since last communication: 0.000s
I20250902 21:32:52.558873 9295 raft_consensus.cc:3053] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 10 LEADER]: Stepping down as leader of term 10
I20250902 21:32:52.558899 9295 raft_consensus.cc:738] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 10 LEADER]: Becoming Follower/Learner. State: Replica: b41910bc980f4cfdbcc6eb23e1084325, State: Running, Role: LEADER
I20250902 21:32:52.558945 9295 consensus_queue.cc:260] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 18830, Committed index: 18830, Last appended: 10.18831, Last appended by leader: 18831, Current term: 10, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:52.559058 9295 raft_consensus.cc:3058] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 10 FOLLOWER]: Advancing to term 11
W20250902 21:32:52.559813 8756 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48046: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:52.559933 8765 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48046: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:52.561249 9033 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:56996: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:52.562992 9033 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:56996: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
I20250902 21:32:52.563513 9072 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "f79e0a34bf4d4181aed51bd155010a25" candidate_term: 11 candidate_status { last_received { term: 10 index: 18830 } } ignore_live_leader: true dest_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5"
I20250902 21:32:52.563616 9072 raft_consensus.cc:3058] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 10 FOLLOWER]: Advancing to term 11
I20250902 21:32:52.563972 8790 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "f79e0a34bf4d4181aed51bd155010a25" candidate_term: 11 candidate_status { last_received { term: 10 index: 18830 } } ignore_live_leader: true dest_uuid: "b41910bc980f4cfdbcc6eb23e1084325"
I20250902 21:32:52.564100 8790 raft_consensus.cc:2408] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 11 FOLLOWER]: Leader election vote request: Denying vote to candidate f79e0a34bf4d4181aed51bd155010a25 for term 11 because replica has last-logged OpId of term: 10 index: 18831, which is greater than that of the candidate, which has last-logged OpId of term: 10 index: 18830.
I20250902 21:32:52.564281 9072 raft_consensus.cc:2466] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 11 FOLLOWER]: Leader election vote request: Granting yes vote for candidate f79e0a34bf4d4181aed51bd155010a25 in term 11.
I20250902 21:32:52.564424 9140 leader_election.cc:304] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [CANDIDATE]: Term 11 election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 3a35f7f28cb9438dbcfb3196e167fdc5, f79e0a34bf4d4181aed51bd155010a25; no voters: b41910bc980f4cfdbcc6eb23e1084325
W20250902 21:32:52.565439 9166 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51606: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:52.567508 9166 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51606: Illegal state: replica f79e0a34bf4d4181aed51bd155010a25 is not leader of this config: current role FOLLOWER
W20250902 21:32:52.571480 8765 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48046: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:52.575286 8765 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48046: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:52.577958 9033 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:56996: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:52.580705 9033 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:56996: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
I20250902 21:32:52.581517 9490 raft_consensus.cc:2802] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 11 FOLLOWER]: Leader election won for term 11
I20250902 21:32:52.582321 9490 raft_consensus.cc:695] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 11 LEADER]: Becoming Leader. State: Replica: f79e0a34bf4d4181aed51bd155010a25, State: Running, Role: LEADER
I20250902 21:32:52.582464 9490 consensus_queue.cc:237] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 18829, Committed index: 18829, Last appended: 10.18830, Last appended by leader: 18830, Current term: 11, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:52.583834 5496 catalog_manager.cc:5582] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 reported cstate change: term changed from 10 to 11, leader changed from b41910bc980f4cfdbcc6eb23e1084325 (127.4.52.194) to f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195). New cstate: current_term: 11 leader_uuid: "f79e0a34bf4d4181aed51bd155010a25" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } health_report { overall_health: UNKNOWN } } }
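The vote denial at 21:32:52.564100 ("replica has last-logged OpId of term: 10 index: 18831, which is greater than that of the candidate") together with the grant from the third replica shows Raft's election-safety rule: a voter grants a vote only if the candidate's term is current and the candidate's log is at least as up to date, comparing last-logged OpIds by term first and index second. The sketch below implements just that comparison; the types and ShouldGrantVote() signature are assumptions for illustration (it also omits the "already voted this term" bookkeeping a real implementation tracks).

```cpp
// Sketch of the Raft vote-granting rule visible in the log: the voter
// compares its own last-logged OpId against the candidate's and denies the
// vote if its log is more up to date. Types are illustrative only.
#include <cstdint>
#include <iostream>
#include <tuple>

struct OpId {
  int64_t term;
  int64_t index;
};

// Raft compares logs by (term, index), term first.
bool CandidateLogIsUpToDate(const OpId& candidate_last, const OpId& voter_last) {
  return std::tie(candidate_last.term, candidate_last.index) >=
         std::tie(voter_last.term, voter_last.index);
}

bool ShouldGrantVote(int64_t candidate_term, int64_t current_term,
                     const OpId& candidate_last, const OpId& voter_last) {
  if (candidate_term < current_term) return false;  // stale candidate term
  return CandidateLogIsUpToDate(candidate_last, voter_last);
}

int main() {
  std::cout << std::boolalpha
            // Voter with last-logged 10.18831 denies a candidate at 10.18830.
            << ShouldGrantVote(11, 11, {10, 18830}, {10, 18831}) << "\n"  // false
            // Voter with last-logged 10.18830 grants the same request.
            << ShouldGrantVote(11, 11, {10, 18830}, {10, 18830}) << "\n"; // true
}
```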
I20250902 21:32:52.584559 9180 tablet_service.cc:1423] Tablet server f79e0a34bf4d4181aed51bd155010a25 set to quiescing
I20250902 21:32:52.584606 9180 tablet_service.cc:1430] Tablet server has 1 leaders and 3 scanners
I20250902 21:32:52.584856 9253 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
I20250902 21:32:52.586056 9072 raft_consensus.cc:1273] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 11 FOLLOWER]: Refusing update from remote peer f79e0a34bf4d4181aed51bd155010a25: Log matching property violated. Preceding OpId in replica: term: 10 index: 18830. Preceding OpId from leader: term: 11 index: 18832. (index mismatch)
I20250902 21:32:52.586170 8790 raft_consensus.cc:1273] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 11 FOLLOWER]: Refusing update from remote peer f79e0a34bf4d4181aed51bd155010a25: Log matching property violated. Preceding OpId in replica: term: 10 index: 18831. Preceding OpId from leader: term: 11 index: 18832. (index mismatch)
I20250902 21:32:52.586391 9490 consensus_queue.cc:1035] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [LEADER]: Connected to new peer: Peer: permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 18831, Last known committed idx: 18829, Time since last communication: 0.000s
I20250902 21:32:52.586992 9490 consensus_queue.cc:1035] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [LEADER]: Connected to new peer: Peer: permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 18831, Last known committed idx: 18830, Time since last communication: 0.000s
I20250902 21:32:52.587095 9490 consensus_queue.cc:1230] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [LEADER]: Peer b41910bc980f4cfdbcc6eb23e1084325 log is divergent from this leader: its last log entry 10.18831 is not in this leader's log and it has not received anything from this leader yet. Falling back to committed index 18830
I20250902 21:32:52.587442 8790 pending_rounds.cc:85] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Aborting all ops after (but not including) 18830
I20250902 21:32:52.587493 8790 pending_rounds.cc:107] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Aborting uncommitted WRITE_OP operation due to leader change: 10.18831
W20250902 21:32:52.587548 8790 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:48046: Aborted: Op aborted by new leader
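The three lines ending here show what happens when a follower's log has diverged from the new leader's: the leader falls back to the committed index (18830), and the follower aborts every pending operation after that point, failing the in-flight write with "Op aborted by new leader". A compact sketch of that truncation step follows; the PendingOp container and AbortOpsAfter() helper are assumptions, not Kudu's pending_rounds implementation.

```cpp
// Sketch of a follower discarding uncommitted operations that diverge from
// the new leader's log, as in "Aborting all ops after (but not including)
// 18830". Container and helper names are illustrative assumptions.
#include <cstdint>
#include <iostream>
#include <map>
#include <string>

struct PendingOp {
  int64_t term;
  std::string type;
};

// Aborts every pending op with index > committed_index and returns how many
// were dropped; in the real system each aborted write is answered with an
// error such as "Op aborted by new leader".
int AbortOpsAfter(std::map<int64_t, PendingOp>* pending, int64_t committed_index) {
  int aborted = 0;
  for (auto it = pending->upper_bound(committed_index); it != pending->end();) {
    std::cout << "aborting uncommitted " << it->second.type << " op "
              << it->second.term << "." << it->first << "\n";
    it = pending->erase(it);
    ++aborted;
  }
  return aborted;
}

int main() {
  // One uncommitted, divergent write at 10.18831; committed index is 18830.
  std::map<int64_t, PendingOp> pending = {{18831, {10, "WRITE_OP"}}};
  AbortOpsAfter(&pending, /*committed_index=*/18830);
}
```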
I20250902 21:32:52.588042 9119 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
I20250902 21:32:52.588726 8852 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
I20250902 21:32:52.589727 9490 raft_consensus.cc:991] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: : Instructing follower 3a35f7f28cb9438dbcfb3196e167fdc5 to start an election
I20250902 21:32:52.589835 9490 raft_consensus.cc:1079] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 11 LEADER]: Signalling peer 3a35f7f28cb9438dbcfb3196e167fdc5 to start an election
I20250902 21:32:52.590538 9490 raft_consensus.cc:991] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: : Instructing follower 3a35f7f28cb9438dbcfb3196e167fdc5 to start an election
I20250902 21:32:52.590598 9072 tablet_service.cc:1940] Received Run Leader Election RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b"
dest_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5"
from {username='slave'} at 127.4.52.195:38435
I20250902 21:32:52.590672 9072 raft_consensus.cc:491] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 11 FOLLOWER]: Starting forced leader election (received explicit request)
I20250902 21:32:52.590698 9072 raft_consensus.cc:3058] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 11 FOLLOWER]: Advancing to term 12
I20250902 21:32:52.590756 9490 raft_consensus.cc:1079] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 11 LEADER]: Signalling peer 3a35f7f28cb9438dbcfb3196e167fdc5 to start an election
I20250902 21:32:52.591496 9072 raft_consensus.cc:513] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 12 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:52.591730 9072 leader_election.cc:290] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [CANDIDATE]: Term 12 election: Requested vote from peers f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587), b41910bc980f4cfdbcc6eb23e1084325 (127.4.52.194:33503)
I20250902 21:32:52.591817 9073 tablet_service.cc:1940] Received Run Leader Election RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b"
dest_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5"
from {username='slave'} at 127.4.52.195:38435
I20250902 21:32:52.591856 9073 raft_consensus.cc:491] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 12 FOLLOWER]: Starting forced leader election (received explicit request)
I20250902 21:32:52.591877 9073 raft_consensus.cc:3058] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 12 FOLLOWER]: Advancing to term 13
I20250902 21:32:52.592546 9073 raft_consensus.cc:513] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 13 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:52.592641 9073 leader_election.cc:290] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [CANDIDATE]: Term 13 election: Requested vote from peers f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587), b41910bc980f4cfdbcc6eb23e1084325 (127.4.52.194:33503)
I20250902 21:32:52.595198 9073 raft_consensus.cc:1238] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 13 FOLLOWER]: Rejecting Update request from peer f79e0a34bf4d4181aed51bd155010a25 for earlier term 11. Current term is 13. Ops: [11.18834-11.18836]
I20250902 21:32:52.595721 9493 consensus_queue.cc:1046] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [LEADER]: Peer responded invalid term: Peer: permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 }, Status: INVALID_TERM, Last received: 11.18833, Next index: 18834, Last known committed idx: 18833, Time since last communication: 0.000s
I20250902 21:32:52.596194 9490 raft_consensus.cc:3053] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 11 LEADER]: Stepping down as leader of term 11
I20250902 21:32:52.596309 9490 raft_consensus.cc:738] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 11 LEADER]: Becoming Follower/Learner. State: Replica: f79e0a34bf4d4181aed51bd155010a25, State: Running, Role: LEADER
I20250902 21:32:52.596439 9490 consensus_queue.cc:260] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 18836, Committed index: 18836, Last appended: 11.18836, Last appended by leader: 18836, Current term: 11, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:52.596655 9490 raft_consensus.cc:3058] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 11 FOLLOWER]: Advancing to term 13
I20250902 21:32:52.598815 9207 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" candidate_term: 12 candidate_status { last_received { term: 11 index: 18833 } } ignore_live_leader: true dest_uuid: "f79e0a34bf4d4181aed51bd155010a25"
I20250902 21:32:52.598894 9207 raft_consensus.cc:2366] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 13 FOLLOWER]: Leader election vote request: Denying vote to candidate 3a35f7f28cb9438dbcfb3196e167fdc5 for earlier term 12. Current term is 13.
I20250902 21:32:52.598958 9206 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" candidate_term: 13 candidate_status { last_received { term: 11 index: 18833 } } ignore_live_leader: true dest_uuid: "f79e0a34bf4d4181aed51bd155010a25"
I20250902 21:32:52.599011 9206 raft_consensus.cc:2408] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 13 FOLLOWER]: Leader election vote request: Denying vote to candidate 3a35f7f28cb9438dbcfb3196e167fdc5 for term 13 because replica has last-logged OpId of term: 11 index: 18836, which is greater than that of the candidate, which has last-logged OpId of term: 11 index: 18833.
I20250902 21:32:52.599123 9005 leader_election.cc:400] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [CANDIDATE]: Term 12 election: Vote denied by peer f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587) with higher term. Message: Invalid argument: T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 13 FOLLOWER]: Leader election vote request: Denying vote to candidate 3a35f7f28cb9438dbcfb3196e167fdc5 for earlier term 12. Current term is 13.
I20250902 21:32:52.599153 9005 leader_election.cc:403] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [CANDIDATE]: Term 12 election: Cancelling election due to peer responding with higher term
I20250902 21:32:52.599516 9495 raft_consensus.cc:2747] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 13 FOLLOWER]: Leader election lost for term 12. Reason: Vote denied by peer f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587) with higher term. Message: Invalid argument: T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 13 FOLLOWER]: Leader election vote request: Denying vote to candidate 3a35f7f28cb9438dbcfb3196e167fdc5 for earlier term 12. Current term is 13.
I20250902 21:32:52.601233 8790 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" candidate_term: 12 candidate_status { last_received { term: 11 index: 18833 } } ignore_live_leader: true dest_uuid: "b41910bc980f4cfdbcc6eb23e1084325"
I20250902 21:32:52.601308 8790 raft_consensus.cc:3058] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 11 FOLLOWER]: Advancing to term 12
I20250902 21:32:52.602329 8790 raft_consensus.cc:2408] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 12 FOLLOWER]: Leader election vote request: Denying vote to candidate 3a35f7f28cb9438dbcfb3196e167fdc5 for term 12 because replica has last-logged OpId of term: 11 index: 18836, which is greater than that of the candidate, which has last-logged OpId of term: 11 index: 18833.
I20250902 21:32:52.601246 8806 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" candidate_term: 13 candidate_status { last_received { term: 11 index: 18833 } } ignore_live_leader: true dest_uuid: "b41910bc980f4cfdbcc6eb23e1084325"
I20250902 21:32:52.602531 8806 raft_consensus.cc:3058] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 12 FOLLOWER]: Advancing to term 13
I20250902 21:32:52.603153 8806 raft_consensus.cc:2408] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 13 FOLLOWER]: Leader election vote request: Denying vote to candidate 3a35f7f28cb9438dbcfb3196e167fdc5 for term 13 because replica has last-logged OpId of term: 11 index: 18836, which is greater than that of the candidate, which has last-logged OpId of term: 11 index: 18833.
I20250902 21:32:52.603451 9005 leader_election.cc:304] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [CANDIDATE]: Term 13 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 3a35f7f28cb9438dbcfb3196e167fdc5; no voters: b41910bc980f4cfdbcc6eb23e1084325, f79e0a34bf4d4181aed51bd155010a25
I20250902 21:32:52.603533 9495 raft_consensus.cc:2747] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 13 FOLLOWER]: Leader election lost for term 13. Reason: could not achieve majority
I20250902 21:32:52.615861 9043 tablet_service.cc:1423] Tablet server 3a35f7f28cb9438dbcfb3196e167fdc5 set to quiescing
I20250902 21:32:52.615924 9043 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250902 21:32:52.621189 8921 tablet_service.cc:1423] Tablet server 6311ff5fc63a49108dc3000117399229 set to quiescing
I20250902 21:32:52.621244 8921 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250902 21:32:52.799937 8987 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
W20250902 21:32:52.896872 9329 raft_consensus.cc:668] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: failed to trigger leader election: Illegal state: leader elections are disabled
W20250902 21:32:52.933531 9501 raft_consensus.cc:668] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: failed to trigger leader election: Illegal state: leader elections are disabled
W20250902 21:32:53.040382 9495 raft_consensus.cc:668] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: failed to trigger leader election: Illegal state: leader elections are disabled
I20250902 21:32:53.738943 9180 tablet_service.cc:1423] Tablet server f79e0a34bf4d4181aed51bd155010a25 set to quiescing
I20250902 21:32:53.738998 9180 tablet_service.cc:1430] Tablet server has 0 leaders and 1 scanners
I20250902 21:32:53.740131 8786 tablet_service.cc:1423] Tablet server b41910bc980f4cfdbcc6eb23e1084325 set to quiescing
I20250902 21:32:53.740195 8786 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250902 21:32:54.876729 9180 tablet_service.cc:1423] Tablet server f79e0a34bf4d4181aed51bd155010a25 set to quiescing
I20250902 21:32:54.876796 9180 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
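The quiescing lines above illustrate the rolling-restart pattern this test exercises: each tablet server is set to quiescing, its leader replicas hand off leadership (and the server declines to trigger new elections, hence the "leader elections are disabled" warnings), and the process is only killed once it reports "0 leaders and 0 scanners". Below is a sketch of such a wait loop; GetQuiescingCounts-style polling via std::function and the 50 ms interval are assumptions, not the test framework's real status RPC.

```cpp
// Sketch of waiting for a quiescing tablet server to be safe to restart:
// poll until it reports no leader replicas and no active scanners, or give
// up after a deadline. The polling callback is a hypothetical stand-in.
#include <chrono>
#include <functional>
#include <iostream>
#include <thread>

struct QuiescingCounts {
  int leaders;
  int scanners;
};

bool WaitUntilQuiesced(const std::function<QuiescingCounts()>& get_counts,
                       std::chrono::milliseconds timeout) {
  const auto deadline = std::chrono::steady_clock::now() + timeout;
  while (std::chrono::steady_clock::now() < deadline) {
    const QuiescingCounts c = get_counts();
    if (c.leaders == 0 && c.scanners == 0) return true;  // safe to kill/restart
    std::this_thread::sleep_for(std::chrono::milliseconds(50));
  }
  return false;
}

int main() {
  int polls = 0;
  // Simulated server: still has a leader and a scanner for the first polls.
  auto fake_status = [&polls]() -> QuiescingCounts {
    ++polls;
    return polls < 3 ? QuiescingCounts{1, 1} : QuiescingCounts{0, 0};
  };
  std::cout << std::boolalpha
            << WaitUntilQuiesced(fake_status, std::chrono::seconds(5)) << "\n";
}
```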
I20250902 21:32:54.932061 4307 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6wlXYv/build/release/bin/kudu with pid 8723
I20250902 21:32:54.942942 4307 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6wlXYv/build/release/bin/kudu
/tmp/dist-test-task6wlXYv/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.4.52.194:33503
--local_ip_for_outbound_sockets=127.4.52.194
--tserver_master_addrs=127.4.52.254:39913
--webserver_port=37969
--webserver_interface=127.4.52.194
--builtin_ntp_servers=127.4.52.212:42387
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20250902 21:32:55.015393 9554 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250902 21:32:55.015547 9554 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250902 21:32:55.015568 9554 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250902 21:32:55.016937 9554 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250902 21:32:55.016984 9554 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.4.52.194
I20250902 21:32:55.018371 9554 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.4.52.212:42387
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.4.52.194:33503
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.4.52.194
--webserver_port=37969
--tserver_master_addrs=127.4.52.254:39913
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.9554
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.4.52.194
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 6ab3f249351a4ee5d73eb8bda3585c4dcd85d070
build type RELEASE
built by None at 02 Sep 2025 21:15:19 UTC on 5fd53c4cbb9d
build id 7867
I20250902 21:32:55.018563 9554 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250902 21:32:55.018790 9554 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250902 21:32:55.021036 9562 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:55.021036 9559 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:55.021155 9554 server_base.cc:1047] running on GCE node
W20250902 21:32:55.021056 9560 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:55.021404 9554 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250902 21:32:55.021592 9554 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250902 21:32:55.022584 6123 meta_cache.cc:1510] marking tablet server b41910bc980f4cfdbcc6eb23e1084325 (127.4.52.194:33503) as failed
W20250902 21:32:55.022636 6123 meta_cache.cc:302] tablet 1bd11ec67527495b831ab711f8a2f39b: replica b41910bc980f4cfdbcc6eb23e1084325 (127.4.52.194:33503) has failed: Network error: TS failed: Client connection negotiation failed: client connection to 127.4.52.194:33503: connect: Connection refused (error 111)
I20250902 21:32:55.022734 9554 hybrid_clock.cc:648] HybridClock initialized: now 1756848775022724 us; error 31 us; skew 500 ppm
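The "HybridClock initialized: now ... error 31 us; skew 500 ppm" line reports the clock's initial maximum error and its assumed drift rate; between synchronizations the error bound grows at roughly the skew rate, i.e. about 500 µs per elapsed second here. The arithmetic below is the generic bounded-clock model worked on the numbers from this line, not a quote of Kudu's exact implementation.

```cpp
// Rough sketch of how a bounded clock's maximum error grows between
// synchronizations: error(t) = initial_error + skew_ppm * elapsed.
// The constants come from the log line above; the formula is the generic
// model, not Kudu's exact code.
#include <cstdint>
#include <iostream>

int64_t MaxErrorUs(int64_t initial_error_us, int64_t skew_ppm, int64_t elapsed_us) {
  // skew_ppm is parts-per-million of drift per unit of elapsed time.
  return initial_error_us + (elapsed_us * skew_ppm) / 1000000;
}

int main() {
  // One second after the sync in the log, the bound is ~31 + 500 = 531 us.
  std::cout << MaxErrorUs(31, 500, 1000000) << " us\n";
}
```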
I20250902 21:32:55.023705 9554 webserver.cc:480] Webserver started at http://127.4.52.194:37969/ using document root <none> and password file <none>
I20250902 21:32:55.023877 9554 fs_manager.cc:362] Metadata directory not provided
I20250902 21:32:55.023941 9554 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250902 21:32:55.024991 9554 fs_manager.cc:714] Time spent opening directory manager: real 0.001s user 0.000s sys 0.001s
I20250902 21:32:55.025571 9569 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:55.025718 9554 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.000s sys 0.001s
I20250902 21:32:55.025785 9554 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/data,/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/wal
uuid: "b41910bc980f4cfdbcc6eb23e1084325"
format_stamp: "Formatted at 2025-09-02 21:32:16 on dist-test-slave-jkp9"
I20250902 21:32:55.026007 9554 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250902 21:32:55.039117 9554 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250902 21:32:55.039304 9554 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250902 21:32:55.039420 9554 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250902 21:32:55.039604 9554 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250902 21:32:55.039940 9576 ts_tablet_manager.cc:536] Loading tablet metadata (0/1 complete)
I20250902 21:32:55.040638 9554 ts_tablet_manager.cc:579] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20250902 21:32:55.040675 9554 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.001s user 0.000s sys 0.000s
I20250902 21:32:55.040706 9554 ts_tablet_manager.cc:594] Registering tablets (0/1 complete)
I20250902 21:32:55.041167 9554 ts_tablet_manager.cc:610] Registered 1 tablets
I20250902 21:32:55.041195 9554 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:55.041247 9576 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Bootstrap starting.
I20250902 21:32:55.046638 9554 rpc_server.cc:307] RPC server started. Bound to: 127.4.52.194:33503
I20250902 21:32:55.046693 9683 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.4.52.194:33503 every 8 connection(s)
I20250902 21:32:55.046980 9554 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-1/data/info.pb
I20250902 21:32:55.051079 9684 heartbeater.cc:344] Connected to a master server at 127.4.52.254:39913
I20250902 21:32:55.051239 9684 heartbeater.cc:461] Registering TS with master...
I20250902 21:32:55.051481 9684 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
I20250902 21:32:55.052030 5496 ts_manager.cc:194] Re-registered known tserver with Master: b41910bc980f4cfdbcc6eb23e1084325 (127.4.52.194:33503)
I20250902 21:32:55.052558 5496 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.4.52.194:43349
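Once the restarted tserver binds its RPC address, it connects to the master, registers, and answers the master's request for a full tablet report; because the UUID is already known, the master logs "Re-registered known tserver" and signs a fresh certificate. The sketch below shows only the known-versus-new distinction on the master side, with an assumed registry structure rather than Kudu's TSManager.

```cpp
// Sketch of a master-side registry that distinguishes a brand-new tserver
// registration from the re-registration of a known UUID, as in
// "Re-registered known tserver with Master". Structures are assumptions.
#include <cstdint>
#include <iostream>
#include <string>
#include <unordered_map>

struct TSRegistration {
  std::string rpc_address;
  int64_t sequence_number;  // bumped on every (re-)registration
};

class TSRegistry {
 public:
  // Returns true if the UUID was already known (a re-registration).
  bool Register(const std::string& uuid, const std::string& rpc_address) {
    auto [it, inserted] = servers_.try_emplace(uuid, TSRegistration{rpc_address, 0});
    it->second.rpc_address = rpc_address;   // refresh the address on re-registration
    ++it->second.sequence_number;
    return !inserted;
  }

 private:
  std::unordered_map<std::string, TSRegistration> servers_;
};

int main() {
  TSRegistry registry;
  std::cout << std::boolalpha
            << registry.Register("b41910bc980f4cfdbcc6eb23e1084325", "127.4.52.194:33503") << "\n"  // false: new
            << registry.Register("b41910bc980f4cfdbcc6eb23e1084325", "127.4.52.194:33503") << "\n"; // true: re-registered
}
```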
I20250902 21:32:55.057008 4307 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6wlXYv/build/release/bin/kudu as pid 9554
I20250902 21:32:55.057088 4307 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6wlXYv/build/release/bin/kudu with pid 8856
I20250902 21:32:55.057706 9576 log.cc:826] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Log is configured to *not* fsync() on all Append() calls
I20250902 21:32:55.062098 4307 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6wlXYv/build/release/bin/kudu
/tmp/dist-test-task6wlXYv/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.4.52.193:34353
--local_ip_for_outbound_sockets=127.4.52.193
--tserver_master_addrs=127.4.52.254:39913
--webserver_port=43863
--webserver_interface=127.4.52.193
--builtin_ntp_servers=127.4.52.212:42387
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20250902 21:32:55.137764 9689 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250902 21:32:55.137930 9689 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250902 21:32:55.137960 9689 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250902 21:32:55.139268 9689 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250902 21:32:55.139323 9689 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.4.52.193
I20250902 21:32:55.140702 9689 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.4.52.212:42387
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.4.52.193:34353
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.4.52.193
--webserver_port=43863
--tserver_master_addrs=127.4.52.254:39913
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.9689
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.4.52.193
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 6ab3f249351a4ee5d73eb8bda3585c4dcd85d070
build type RELEASE
built by None at 02 Sep 2025 21:15:19 UTC on 5fd53c4cbb9d
build id 7867
I20250902 21:32:55.140908 9689 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250902 21:32:55.141109 9689 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250902 21:32:55.143337 9694 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:55.143477 9689 server_base.cc:1047] running on GCE node
W20250902 21:32:55.143427 9697 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:55.143354 9695 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:55.143723 9689 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250902 21:32:55.143906 9689 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250902 21:32:55.145041 9689 hybrid_clock.cc:648] HybridClock initialized: now 1756848775145031 us; error 26 us; skew 500 ppm
I20250902 21:32:55.146100 9689 webserver.cc:480] Webserver started at http://127.4.52.193:43863/ using document root <none> and password file <none>
I20250902 21:32:55.146322 9689 fs_manager.cc:362] Metadata directory not provided
I20250902 21:32:55.146368 9689 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250902 21:32:55.147493 9689 fs_manager.cc:714] Time spent opening directory manager: real 0.001s user 0.001s sys 0.000s
I20250902 21:32:55.148027 9703 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:55.148241 9689 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.001s sys 0.000s
I20250902 21:32:55.148324 9689 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/data,/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/wal
uuid: "6311ff5fc63a49108dc3000117399229"
format_stamp: "Formatted at 2025-09-02 21:32:16 on dist-test-slave-jkp9"
I20250902 21:32:55.148638 9689 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250902 21:32:55.179802 9689 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250902 21:32:55.180084 9689 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250902 21:32:55.180207 9689 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250902 21:32:55.180455 9689 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250902 21:32:55.180809 9689 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250902 21:32:55.180855 9689 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:55.180897 9689 ts_tablet_manager.cc:610] Registered 0 tablets
I20250902 21:32:55.180917 9689 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:55.188007 9689 rpc_server.cc:307] RPC server started. Bound to: 127.4.52.193:34353
I20250902 21:32:55.188037 9816 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.4.52.193:34353 every 8 connection(s)
I20250902 21:32:55.188396 9689 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-0/data/info.pb
I20250902 21:32:55.193171 9817 heartbeater.cc:344] Connected to a master server at 127.4.52.254:39913
I20250902 21:32:55.193277 9817 heartbeater.cc:461] Registering TS with master...
I20250902 21:32:55.193465 9817 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
I20250902 21:32:55.193796 5496 ts_manager.cc:194] Re-registered known tserver with Master: 6311ff5fc63a49108dc3000117399229 (127.4.52.193:34353)
I20250902 21:32:55.194286 5496 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.4.52.193:33967
I20250902 21:32:55.197124 4307 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6wlXYv/build/release/bin/kudu as pid 9689
I20250902 21:32:55.197233 4307 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6wlXYv/build/release/bin/kudu with pid 8990
I20250902 21:32:55.209182 4307 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6wlXYv/build/release/bin/kudu
/tmp/dist-test-task6wlXYv/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.4.52.196:35541
--local_ip_for_outbound_sockets=127.4.52.196
--tserver_master_addrs=127.4.52.254:39913
--webserver_port=36885
--webserver_interface=127.4.52.196
--builtin_ntp_servers=127.4.52.212:42387
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20250902 21:32:55.304757 9820 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250902 21:32:55.304955 9820 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250902 21:32:55.304996 9820 flags.cc:425] Enabled unsafe flag: --never_fsync=true
I20250902 21:32:55.304994 6121 meta_cache.cc:1510] marking tablet server 3a35f7f28cb9438dbcfb3196e167fdc5 (127.4.52.196:35541) as failed
W20250902 21:32:55.306437 9820 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250902 21:32:55.306479 9820 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.4.52.196
I20250902 21:32:55.307792 9820 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.4.52.212:42387
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.4.52.196:35541
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.4.52.196
--webserver_port=36885
--tserver_master_addrs=127.4.52.254:39913
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.9820
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.4.52.196
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 6ab3f249351a4ee5d73eb8bda3585c4dcd85d070
build type RELEASE
built by None at 02 Sep 2025 21:15:19 UTC on 5fd53c4cbb9d
build id 7867
I20250902 21:32:55.307943 9820 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250902 21:32:55.308130 9820 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250902 21:32:55.310271 9825 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:55.310271 9826 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:55.310449 9820 server_base.cc:1047] running on GCE node
W20250902 21:32:55.310293 9828 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:55.310632 9820 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250902 21:32:55.310822 9820 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250902 21:32:55.311980 9820 hybrid_clock.cc:648] HybridClock initialized: now 1756848775311973 us; error 24 us; skew 500 ppm
I20250902 21:32:55.313010 9820 webserver.cc:480] Webserver started at http://127.4.52.196:36885/ using document root <none> and password file <none>
I20250902 21:32:55.313197 9820 fs_manager.cc:362] Metadata directory not provided
I20250902 21:32:55.313235 9820 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250902 21:32:55.314314 9820 fs_manager.cc:714] Time spent opening directory manager: real 0.001s user 0.001s sys 0.000s
I20250902 21:32:55.314913 9834 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:55.315096 9820 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.001s sys 0.000s
I20250902 21:32:55.315162 9820 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/data,/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/wal
uuid: "3a35f7f28cb9438dbcfb3196e167fdc5"
format_stamp: "Formatted at 2025-09-02 21:32:16 on dist-test-slave-jkp9"
I20250902 21:32:55.315394 9820 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250902 21:32:55.330135 9820 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250902 21:32:55.330312 9820 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250902 21:32:55.330395 9820 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250902 21:32:55.330554 9820 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250902 21:32:55.330941 9841 ts_tablet_manager.cc:536] Loading tablet metadata (0/1 complete)
I20250902 21:32:55.331735 9820 ts_tablet_manager.cc:579] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20250902 21:32:55.331784 9820 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.001s user 0.000s sys 0.000s
I20250902 21:32:55.331806 9820 ts_tablet_manager.cc:594] Registering tablets (0/1 complete)
I20250902 21:32:55.332288 9820 ts_tablet_manager.cc:610] Registered 1 tablets
I20250902 21:32:55.332338 9820 ts_tablet_manager.cc:589] Time spent register tablets: real 0.001s user 0.000s sys 0.000s
I20250902 21:32:55.332394 9841 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Bootstrap starting.
I20250902 21:32:55.338819 9820 rpc_server.cc:307] RPC server started. Bound to: 127.4.52.196:35541
I20250902 21:32:55.339128 9820 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-3/data/info.pb
I20250902 21:32:55.339504 9948 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.4.52.196:35541 every 8 connection(s)
I20250902 21:32:55.344250 4307 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6wlXYv/build/release/bin/kudu as pid 9820
I20250902 21:32:55.344339 4307 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6wlXYv/build/release/bin/kudu with pid 9122
I20250902 21:32:55.348398 9841 log.cc:826] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Log is configured to *not* fsync() on all Append() calls
I20250902 21:32:55.351052 9949 heartbeater.cc:344] Connected to a master server at 127.4.52.254:39913
I20250902 21:32:55.351186 9949 heartbeater.cc:461] Registering TS with master...
I20250902 21:32:55.351372 9949 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
I20250902 21:32:55.351929 5496 ts_manager.cc:194] Re-registered known tserver with Master: 3a35f7f28cb9438dbcfb3196e167fdc5 (127.4.52.196:35541)
I20250902 21:32:55.352433 5496 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.4.52.196:59411
I20250902 21:32:55.362002 4307 external_mini_cluster.cc:1366] Running /tmp/dist-test-task6wlXYv/build/release/bin/kudu
/tmp/dist-test-task6wlXYv/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.4.52.195:46587
--local_ip_for_outbound_sockets=127.4.52.195
--tserver_master_addrs=127.4.52.254:39913
--webserver_port=32789
--webserver_interface=127.4.52.195
--builtin_ntp_servers=127.4.52.212:42387
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20250902 21:32:55.473003 9953 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250902 21:32:55.473222 9953 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250902 21:32:55.473254 9953 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250902 21:32:55.475365 9953 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250902 21:32:55.475471 9953 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.4.52.195
I20250902 21:32:55.477633 9953 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.4.52.212:42387
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.4.52.195:46587
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.4.52.195
--webserver_port=32789
--tserver_master_addrs=127.4.52.254:39913
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.9953
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.4.52.195
--log_dir=/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 6ab3f249351a4ee5d73eb8bda3585c4dcd85d070
build type RELEASE
built by None at 02 Sep 2025 21:15:19 UTC on 5fd53c4cbb9d
build id 7867
I20250902 21:32:55.477954 9953 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250902 21:32:55.478194 9953 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250902 21:32:55.480945 9960 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:55.481060 9961 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250902 21:32:55.480964 9963 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250902 21:32:55.481647 9953 server_base.cc:1047] running on GCE node
I20250902 21:32:55.481807 9953 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250902 21:32:55.482038 9953 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250902 21:32:55.483181 9953 hybrid_clock.cc:648] HybridClock initialized: now 1756848775483159 us; error 36 us; skew 500 ppm
I20250902 21:32:55.484525 9953 webserver.cc:480] Webserver started at http://127.4.52.195:32789/ using document root <none> and password file <none>
I20250902 21:32:55.484745 9953 fs_manager.cc:362] Metadata directory not provided
I20250902 21:32:55.484800 9953 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250902 21:32:55.486373 9953 fs_manager.cc:714] Time spent opening directory manager: real 0.001s user 0.001s sys 0.000s
I20250902 21:32:55.487169 9969 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:55.487308 9953 fs_manager.cc:730] Time spent opening block manager: real 0.000s user 0.000s sys 0.000s
I20250902 21:32:55.487368 9953 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/data,/tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/wal
uuid: "f79e0a34bf4d4181aed51bd155010a25"
format_stamp: "Formatted at 2025-09-02 21:32:16 on dist-test-slave-jkp9"
I20250902 21:32:55.487630 9953 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250902 21:32:55.506623 9953 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250902 21:32:55.506865 9953 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250902 21:32:55.506978 9953 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250902 21:32:55.507253 9953 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250902 21:32:55.507783 9976 ts_tablet_manager.cc:536] Loading tablet metadata (0/1 complete)
I20250902 21:32:55.508811 9953 ts_tablet_manager.cc:579] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20250902 21:32:55.508860 9953 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.001s user 0.000s sys 0.000s
I20250902 21:32:55.508898 9953 ts_tablet_manager.cc:594] Registering tablets (0/1 complete)
I20250902 21:32:55.509565 9953 ts_tablet_manager.cc:610] Registered 1 tablets
I20250902 21:32:55.509606 9953 ts_tablet_manager.cc:589] Time spent register tablets: real 0.001s user 0.000s sys 0.000s
I20250902 21:32:55.509739 9976 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Bootstrap starting.
I20250902 21:32:55.516103 9953 rpc_server.cc:307] RPC server started. Bound to: 127.4.52.195:46587
I20250902 21:32:55.516502 9953 server_base.cc:1179] Dumped server information to /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0/minicluster-data/ts-2/data/info.pb
I20250902 21:32:55.518867 4307 external_mini_cluster.cc:1428] Started /tmp/dist-test-task6wlXYv/build/release/bin/kudu as pid 9953
I20250902 21:32:55.527853 10083 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.4.52.195:46587 every 8 connection(s)
I20250902 21:32:55.530025 9976 log.cc:826] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Log is configured to *not* fsync() on all Append() calls
I20250902 21:32:55.540427 10084 heartbeater.cc:344] Connected to a master server at 127.4.52.254:39913
I20250902 21:32:55.540531 10084 heartbeater.cc:461] Registering TS with master...
I20250902 21:32:55.540705 10084 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
I20250902 21:32:55.541252 5496 ts_manager.cc:194] Re-registered known tserver with Master: f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587)
I20250902 21:32:55.541750 5496 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.4.52.195:52505
I20250902 21:32:55.690289 9883 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250902 21:32:55.695595 9751 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250902 21:32:55.703110 9618 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250902 21:32:55.707329 10018 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250902 21:32:55.938730 9576 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Bootstrap replayed 1/5 log segments. Stats: ops{read=4623 overwritten=0 applied=4621 ignored=0} inserts{seen=230900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20250902 21:32:56.053416 9684 heartbeater.cc:499] Master 127.4.52.254:39913 was elected leader, sending a full tablet report...
I20250902 21:32:56.194999 9817 heartbeater.cc:499] Master 127.4.52.254:39913 was elected leader, sending a full tablet report...
I20250902 21:32:56.353204 9949 heartbeater.cc:499] Master 127.4.52.254:39913 was elected leader, sending a full tablet report...
I20250902 21:32:56.491757 9841 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Bootstrap replayed 1/5 log segments. Stats: ops{read=4623 overwritten=0 applied=4620 ignored=0} inserts{seen=230850 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20250902 21:32:56.542548 10084 heartbeater.cc:499] Master 127.4.52.254:39913 was elected leader, sending a full tablet report...
I20250902 21:32:56.777477 9976 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Bootstrap replayed 1/5 log segments. Stats: ops{read=4623 overwritten=0 applied=4620 ignored=0} inserts{seen=230850 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20250902 21:32:56.958961 9576 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Bootstrap replayed 2/5 log segments. Stats: ops{read=9246 overwritten=0 applied=9243 ignored=0} inserts{seen=461900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20250902 21:32:57.285128 9841 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Bootstrap replayed 2/5 log segments. Stats: ops{read=9245 overwritten=0 applied=9242 ignored=0} inserts{seen=461850 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20250902 21:32:58.055538 9976 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Bootstrap replayed 2/5 log segments. Stats: ops{read=9245 overwritten=0 applied=9242 ignored=0} inserts{seen=461850 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20250902 21:32:58.099736 9841 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Bootstrap replayed 3/5 log segments. Stats: ops{read=13867 overwritten=0 applied=13864 ignored=0} inserts{seen=692900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20250902 21:32:58.267681 9576 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Bootstrap replayed 3/5 log segments. Stats: ops{read=13867 overwritten=0 applied=13864 ignored=0} inserts{seen=692900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20250902 21:32:58.947913 9841 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Bootstrap replayed 4/5 log segments. Stats: ops{read=18580 overwritten=0 applied=18577 ignored=0} inserts{seen=928500 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20250902 21:32:58.991046 9841 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Bootstrap replayed 5/5 log segments. Stats: ops{read=18833 overwritten=0 applied=18833 ignored=0} inserts{seen=941250 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250902 21:32:58.991602 9841 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Bootstrap complete.
I20250902 21:32:58.998698 9841 ts_tablet_manager.cc:1397] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Time spent bootstrapping tablet: real 3.666s user 3.140s sys 0.511s
I20250902 21:32:58.999593 9841 raft_consensus.cc:357] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 13 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:58.999810 9841 raft_consensus.cc:738] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 13 FOLLOWER]: Becoming Follower/Learner. State: Replica: 3a35f7f28cb9438dbcfb3196e167fdc5, State: Initialized, Role: FOLLOWER
I20250902 21:32:58.999953 9841 consensus_queue.cc:260] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 18833, Last appended: 11.18833, Last appended by leader: 18833, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:59.000224 9841 ts_tablet_manager.cc:1428] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5: Time spent starting tablet: real 0.001s user 0.001s sys 0.000s
W20250902 21:32:59.130640 9854 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:58974: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:59.152356 9854 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:58974: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:59.193640 9854 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:58974: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
I20250902 21:32:59.249739 9976 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Bootstrap replayed 3/5 log segments. Stats: ops{read=13866 overwritten=0 applied=13864 ignored=0} inserts{seen=692900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20250902 21:32:59.319485 10129 raft_consensus.cc:491] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 13 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20250902 21:32:59.319622 10129 raft_consensus.cc:513] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 13 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:59.319901 10129 leader_election.cc:290] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [CANDIDATE]: Term 14 pre-election: Requested pre-vote from peers f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587), b41910bc980f4cfdbcc6eb23e1084325 (127.4.52.194:33503)
I20250902 21:32:59.323685 9634 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" candidate_term: 14 candidate_status { last_received { term: 11 index: 18833 } } ignore_live_leader: false dest_uuid: "b41910bc980f4cfdbcc6eb23e1084325" is_pre_election: true
I20250902 21:32:59.323779 10031 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" candidate_term: 14 candidate_status { last_received { term: 11 index: 18833 } } ignore_live_leader: false dest_uuid: "f79e0a34bf4d4181aed51bd155010a25" is_pre_election: true
W20250902 21:32:59.324748 9835 leader_election.cc:343] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [CANDIDATE]: Term 14 pre-election: Tablet error from VoteRequest() call to peer f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587): Illegal state: must be running to vote when last-logged opid is not known
W20250902 21:32:59.324935 9835 leader_election.cc:343] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [CANDIDATE]: Term 14 pre-election: Tablet error from VoteRequest() call to peer b41910bc980f4cfdbcc6eb23e1084325 (127.4.52.194:33503): Illegal state: must be running to vote when last-logged opid is not known
I20250902 21:32:59.324998 9835 leader_election.cc:304] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [CANDIDATE]: Term 14 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 3a35f7f28cb9438dbcfb3196e167fdc5; no voters: b41910bc980f4cfdbcc6eb23e1084325, f79e0a34bf4d4181aed51bd155010a25
I20250902 21:32:59.325111 10129 raft_consensus.cc:2747] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 13 FOLLOWER]: Leader pre-election lost for term 14. Reason: could not achieve majority
I20250902 21:32:59.384004 9576 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Bootstrap replayed 4/5 log segments. Stats: ops{read=18484 overwritten=0 applied=18483 ignored=0} inserts{seen=923800 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
W20250902 21:32:59.395164 9854 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:58974: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:59.414263 9854 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:58974: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
I20250902 21:32:59.444049 9576 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Bootstrap replayed 5/5 log segments. Stats: ops{read=18837 overwritten=1 applied=18836 ignored=0} inserts{seen=941400 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250902 21:32:59.444558 9576 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Bootstrap complete.
I20250902 21:32:59.451735 9576 ts_tablet_manager.cc:1397] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Time spent bootstrapping tablet: real 4.411s user 3.822s sys 0.583s
I20250902 21:32:59.452255 9576 raft_consensus.cc:357] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 13 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:59.452443 9576 raft_consensus.cc:738] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 13 FOLLOWER]: Becoming Follower/Learner. State: Replica: b41910bc980f4cfdbcc6eb23e1084325, State: Initialized, Role: FOLLOWER
I20250902 21:32:59.452541 9576 consensus_queue.cc:260] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 18836, Last appended: 11.18836, Last appended by leader: 18836, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:59.452781 9576 ts_tablet_manager.cc:1428] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325: Time spent starting tablet: real 0.001s user 0.002s sys 0.000s
W20250902 21:32:59.461478 9854 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:58974: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:59.505698 9598 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:43972: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:59.507316 9854 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:58974: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:59.523363 9854 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:58974: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:59.528980 9598 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:43972: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:59.536623 6122 scanner-internal.cc:458] Time spent opening tablet: real 5.707s user 0.001s sys 0.001s
W20250902 21:32:59.549264 9854 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:58974: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:59.555763 9598 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:43972: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:59.558951 9598 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:43972: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:59.560403 9854 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:58974: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:59.571851 9854 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:58974: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:59.572824 9854 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:58974: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:59.582326 9598 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:43972: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:59.584229 9597 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:43972: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:59.599654 9854 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:58974: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:59.600014 9854 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:58974: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:59.609514 9598 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:43972: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:59.632812 9854 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:58974: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:59.633272 9854 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:58974: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:59.647840 9598 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:43972: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:59.650846 9598 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:43972: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:59.662889 9597 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:43972: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:59.669893 9854 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:58974: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:59.677606 9854 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:58974: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:59.680001 9854 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:58974: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:59.688426 9854 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:58974: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:59.688757 9597 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:43972: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
I20250902 21:32:59.689724 10129 raft_consensus.cc:491] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 13 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20250902 21:32:59.689785 10129 raft_consensus.cc:513] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 13 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:59.689913 10129 leader_election.cc:290] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [CANDIDATE]: Term 14 pre-election: Requested pre-vote from peers f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587), b41910bc980f4cfdbcc6eb23e1084325 (127.4.52.194:33503)
I20250902 21:32:59.690085 10031 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" candidate_term: 14 candidate_status { last_received { term: 11 index: 18833 } } ignore_live_leader: false dest_uuid: "f79e0a34bf4d4181aed51bd155010a25" is_pre_election: true
I20250902 21:32:59.690111 9634 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" candidate_term: 14 candidate_status { last_received { term: 11 index: 18833 } } ignore_live_leader: false dest_uuid: "b41910bc980f4cfdbcc6eb23e1084325" is_pre_election: true
I20250902 21:32:59.690202 9634 raft_consensus.cc:2408] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 13 FOLLOWER]: Leader pre-election vote request: Denying vote to candidate 3a35f7f28cb9438dbcfb3196e167fdc5 for term 14 because replica has last-logged OpId of term: 11 index: 18836, which is greater than that of the candidate, which has last-logged OpId of term: 11 index: 18833.
W20250902 21:32:59.690390 9835 leader_election.cc:343] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [CANDIDATE]: Term 14 pre-election: Tablet error from VoteRequest() call to peer f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587): Illegal state: must be running to vote when last-logged opid is not known
I20250902 21:32:59.690443 9835 leader_election.cc:304] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [CANDIDATE]: Term 14 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 3a35f7f28cb9438dbcfb3196e167fdc5; no voters: b41910bc980f4cfdbcc6eb23e1084325, f79e0a34bf4d4181aed51bd155010a25
I20250902 21:32:59.690539 10129 raft_consensus.cc:2747] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 13 FOLLOWER]: Leader pre-election lost for term 14. Reason: could not achieve majority
W20250902 21:32:59.695936 9597 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:43972: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:59.706004 9598 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:43972: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:59.707135 9598 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:43972: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
W20250902 21:32:59.718699 9854 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:58974: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
W20250902 21:32:59.730589 9854 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:58974: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
I20250902 21:32:59.739588 10136 raft_consensus.cc:491] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 13 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20250902 21:32:59.739696 10136 raft_consensus.cc:513] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 13 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:59.739966 10136 leader_election.cc:290] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [CANDIDATE]: Term 14 pre-election: Requested pre-vote from peers 3a35f7f28cb9438dbcfb3196e167fdc5 (127.4.52.196:35541), f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587)
W20250902 21:32:59.743234 9854 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:58974: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
I20250902 21:32:59.744782 10031 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "b41910bc980f4cfdbcc6eb23e1084325" candidate_term: 14 candidate_status { last_received { term: 11 index: 18836 } } ignore_live_leader: false dest_uuid: "f79e0a34bf4d4181aed51bd155010a25" is_pre_election: true
W20250902 21:32:59.745041 9570 leader_election.cc:343] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [CANDIDATE]: Term 14 pre-election: Tablet error from VoteRequest() call to peer f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587): Illegal state: must be running to vote when last-logged opid is not known
W20250902 21:32:59.748214 9854 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:58974: Illegal state: replica 3a35f7f28cb9438dbcfb3196e167fdc5 is not leader of this config: current role FOLLOWER
I20250902 21:32:59.750262 9903 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "b41910bc980f4cfdbcc6eb23e1084325" candidate_term: 14 candidate_status { last_received { term: 11 index: 18836 } } ignore_live_leader: false dest_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" is_pre_election: true
I20250902 21:32:59.750370 9903 raft_consensus.cc:2466] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 13 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate b41910bc980f4cfdbcc6eb23e1084325 in term 13.
I20250902 21:32:59.750557 9572 leader_election.cc:304] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [CANDIDATE]: Term 14 pre-election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 3a35f7f28cb9438dbcfb3196e167fdc5, b41910bc980f4cfdbcc6eb23e1084325; no voters: f79e0a34bf4d4181aed51bd155010a25
I20250902 21:32:59.750710 10136 raft_consensus.cc:2802] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 13 FOLLOWER]: Leader pre-election won for term 14
I20250902 21:32:59.750795 10136 raft_consensus.cc:491] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 13 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20250902 21:32:59.750828 10136 raft_consensus.cc:3058] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 13 FOLLOWER]: Advancing to term 14
I20250902 21:32:59.752079 10136 raft_consensus.cc:513] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 14 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:59.752214 10136 leader_election.cc:290] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [CANDIDATE]: Term 14 election: Requested vote from peers 3a35f7f28cb9438dbcfb3196e167fdc5 (127.4.52.196:35541), f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587)
I20250902 21:32:59.752485 10031 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "b41910bc980f4cfdbcc6eb23e1084325" candidate_term: 14 candidate_status { last_received { term: 11 index: 18836 } } ignore_live_leader: false dest_uuid: "f79e0a34bf4d4181aed51bd155010a25"
I20250902 21:32:59.752728 9903 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "1bd11ec67527495b831ab711f8a2f39b" candidate_uuid: "b41910bc980f4cfdbcc6eb23e1084325" candidate_term: 14 candidate_status { last_received { term: 11 index: 18836 } } ignore_live_leader: false dest_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5"
I20250902 21:32:59.752794 9903 raft_consensus.cc:3058] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 13 FOLLOWER]: Advancing to term 14
W20250902 21:32:59.752779 9598 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:43972: Illegal state: replica b41910bc980f4cfdbcc6eb23e1084325 is not leader of this config: current role FOLLOWER
I20250902 21:32:59.753914 9903 raft_consensus.cc:2466] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 14 FOLLOWER]: Leader election vote request: Granting yes vote for candidate b41910bc980f4cfdbcc6eb23e1084325 in term 14.
W20250902 21:32:59.754051 9570 leader_election.cc:343] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [CANDIDATE]: Term 14 election: Tablet error from VoteRequest() call to peer f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587): Illegal state: must be running to vote when last-logged opid is not known
I20250902 21:32:59.754096 9570 leader_election.cc:304] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [CANDIDATE]: Term 14 election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 3a35f7f28cb9438dbcfb3196e167fdc5, b41910bc980f4cfdbcc6eb23e1084325; no voters: f79e0a34bf4d4181aed51bd155010a25
I20250902 21:32:59.754314 10136 raft_consensus.cc:2802] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 14 FOLLOWER]: Leader election won for term 14
I20250902 21:32:59.754455 10136 raft_consensus.cc:695] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [term 14 LEADER]: Becoming Leader. State: Replica: b41910bc980f4cfdbcc6eb23e1084325, State: Running, Role: LEADER
I20250902 21:32:59.754524 10136 consensus_queue.cc:237] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 18836, Committed index: 18836, Last appended: 11.18836, Last appended by leader: 18836, Current term: 14, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:32:59.755165 5496 catalog_manager.cc:5582] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 reported cstate change: term changed from 11 to 14, leader changed from f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195) to b41910bc980f4cfdbcc6eb23e1084325 (127.4.52.194). New cstate: current_term: 14 leader_uuid: "b41910bc980f4cfdbcc6eb23e1084325" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } health_report { overall_health: HEALTHY } } }
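The election-summary lines above ("1 yes votes; 2 no votes" for the lost pre-election, then "2 yes votes; 1 no votes" for the won election, with majority size 2 out of 3 voters) follow the standard Raft majority rule. A minimal sketch of that vote arithmetic, not Kudu's actual implementation (the struct and function names here are hypothetical):

```cpp
// Illustrative only: majority-vote arithmetic matching the election summaries
// logged above. With 3 voters the majority size is floor(3/2) + 1 = 2.
#include <cassert>

struct ElectionSummary {
  int voters = 0;
  int yes_votes = 0;
  int no_votes = 0;
};

// A candidate wins once its yes votes reach floor(voters / 2) + 1.
bool CandidateWon(const ElectionSummary& s) {
  const int majority = s.voters / 2 + 1;
  return s.yes_votes >= majority;
}

int main() {
  // Pre-election lost: only the candidate's own vote (1 yes, 2 no).
  assert(!CandidateWon({3, 1, 2}));
  // Election won: candidate plus one peer (2 yes, 1 no), as in the log.
  assert(CandidateWon({3, 2, 1}));
  return 0;
}
```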
W20250902 21:32:59.767552 9570 consensus_peers.cc:489] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 -> Peer f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587): Couldn't send request to peer f79e0a34bf4d4181aed51bd155010a25. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: BOOTSTRAPPING. This is attempt 1: this message will repeat every 5th retry.
I20250902 21:32:59.767606 9903 raft_consensus.cc:1273] T 1bd11ec67527495b831ab711f8a2f39b P 3a35f7f28cb9438dbcfb3196e167fdc5 [term 14 FOLLOWER]: Refusing update from remote peer b41910bc980f4cfdbcc6eb23e1084325: Log matching property violated. Preceding OpId in replica: term: 11 index: 18833. Preceding OpId from leader: term: 14 index: 18838. (index mismatch)
I20250902 21:32:59.767854 10136 consensus_queue.cc:1035] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [LEADER]: Connected to new peer: Peer: permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 18837, Last known committed idx: 18833, Time since last communication: 0.000s
I20250902 21:32:59.771108 10148 mvcc.cc:204] Tried to move back new op lower bound from 7196052601925496832 to 7196052601875005440. Current Snapshot: MvccSnapshot[applied={T|T < 7196052601925496832}]
W20250902 21:32:59.808241 6121 scanner-internal.cc:458] Time spent opening tablet: real 5.707s user 0.001s sys 0.001s
W20250902 21:32:59.826333 6123 scanner-internal.cc:458] Time spent opening tablet: real 6.007s user 0.001s sys 0.001s
I20250902 21:33:00.322347 10157 consensus_queue.cc:786] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 [LEADER]: Peer f79e0a34bf4d4181aed51bd155010a25 is lagging by at least 52 ops behind the committed index
W20250902 21:33:00.326598 9570 consensus_peers.cc:489] T 1bd11ec67527495b831ab711f8a2f39b P b41910bc980f4cfdbcc6eb23e1084325 -> Peer f79e0a34bf4d4181aed51bd155010a25 (127.4.52.195:46587): Couldn't send request to peer f79e0a34bf4d4181aed51bd155010a25. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: BOOTSTRAPPING. This is attempt 6: this message will repeat every 5th retry.
I20250902 21:33:00.408505 9976 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Bootstrap replayed 4/5 log segments. Stats: ops{read=18565 overwritten=0 applied=18565 ignored=0} inserts{seen=927900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250902 21:33:00.494020 9976 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Bootstrap replayed 5/5 log segments. Stats: ops{read=18836 overwritten=0 applied=18833 ignored=0} inserts{seen=941250 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20250902 21:33:00.494805 9976 tablet_bootstrap.cc:492] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Bootstrap complete.
I20250902 21:33:00.504702 9976 ts_tablet_manager.cc:1397] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Time spent bootstrapping tablet: real 4.995s user 4.326s sys 0.608s
I20250902 21:33:00.505445 9976 raft_consensus.cc:357] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 13 FOLLOWER]: Replica starting. Triggering 3 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:33:00.506433 9976 raft_consensus.cc:738] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 13 FOLLOWER]: Becoming Follower/Learner. State: Replica: f79e0a34bf4d4181aed51bd155010a25, State: Initialized, Role: FOLLOWER
I20250902 21:33:00.506628 9976 consensus_queue.cc:260] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 18833, Last appended: 11.18836, Last appended by leader: 18836, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "3a35f7f28cb9438dbcfb3196e167fdc5" member_type: VOTER last_known_addr { host: "127.4.52.196" port: 35541 } } peers { permanent_uuid: "f79e0a34bf4d4181aed51bd155010a25" member_type: VOTER last_known_addr { host: "127.4.52.195" port: 46587 } } peers { permanent_uuid: "b41910bc980f4cfdbcc6eb23e1084325" member_type: VOTER last_known_addr { host: "127.4.52.194" port: 33503 } }
I20250902 21:33:00.506947 9976 ts_tablet_manager.cc:1428] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25: Time spent starting tablet: real 0.002s user 0.001s sys 0.001s
I20250902 21:33:00.538479 10031 raft_consensus.cc:3058] T 1bd11ec67527495b831ab711f8a2f39b P f79e0a34bf4d4181aed51bd155010a25 [term 13 FOLLOWER]: Advancing to term 14
I20250902 21:33:00.610483 10162 mvcc.cc:204] Tried to move back new op lower bound from 7196052603922415616 to 7196052601875005440. Current Snapshot: MvccSnapshot[applied={T|T < 7196052603847675904}]
I20250902 21:33:00.949302 9883 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250902 21:33:00.959715 9751 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250902 21:33:00.963076 10018 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250902 21:33:00.972631 9618 tablet_service.cc:1430] Tablet server has 1 leaders and 3 scanners
I20250902 21:33:01.259060 5496 ts_manager.cc:284] Unset tserver state for f79e0a34bf4d4181aed51bd155010a25 from MAINTENANCE_MODE
I20250902 21:33:01.262209 5497 ts_manager.cc:284] Unset tserver state for 6311ff5fc63a49108dc3000117399229 from MAINTENANCE_MODE
I20250902 21:33:01.382799 5497 ts_manager.cc:284] Unset tserver state for 3a35f7f28cb9438dbcfb3196e167fdc5 from MAINTENANCE_MODE
I20250902 21:33:01.383639 5496 ts_manager.cc:284] Unset tserver state for b41910bc980f4cfdbcc6eb23e1084325 from MAINTENANCE_MODE
/home/jenkins-slave/workspace/build_and_test_flaky/src/kudu/integration-tests/maintenance_mode-itest.cc:751: Failure
Value of: s.ok()
Actual: true
Expected: false
/home/jenkins-slave/workspace/build_and_test_flaky/src/kudu/util/test_util.cc:403: Failure
Failed
Timed out waiting for assertion to pass.
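The failure above pairs a gtest value assertion ("Value of: s.ok() / Actual: true / Expected: false" at maintenance_mode-itest.cc:751) with a retry wrapper that eventually gives up ("Timed out waiting for assertion to pass." from test_util.cc:403). A minimal sketch of that retry-until-deadline pattern in plain gtest; the Status type, FetchStatus(), and WaitFor() below are hypothetical stand-ins, not Kudu's API or the actual test code:

```cpp
// Illustrative only: poll until an operation's Status stops being OK, then
// assert; fail with a timeout message otherwise.
#include <chrono>
#include <functional>
#include <thread>

#include <gtest/gtest.h>

struct Status {
  bool ok() const { return ok_; }
  bool ok_ = true;
};

// Hypothetical operation whose Status the test expects to become non-OK.
Status FetchStatus() { return Status{}; }

// Poll 'pred' every 100 ms until it returns true or 'timeout' elapses.
bool WaitFor(const std::function<bool()>& pred, std::chrono::seconds timeout) {
  const auto deadline = std::chrono::steady_clock::now() + timeout;
  while (std::chrono::steady_clock::now() < deadline) {
    if (pred()) return true;
    std::this_thread::sleep_for(std::chrono::milliseconds(100));
  }
  return false;
}

TEST(MaintenanceModeSketch, OperationEventuallyFails) {
  // Wait for s.ok() to turn false, mirroring the expectation in the log.
  Status s;
  const bool became_non_ok = WaitFor(
      [&] { s = FetchStatus(); return !s.ok(); }, std::chrono::seconds(5));
  ASSERT_TRUE(became_non_ok) << "Timed out waiting for assertion to pass.";
  ASSERT_FALSE(s.ok());
}
```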
I20250902 21:33:01.571808 10084 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
I20250902 21:33:01.588172 9684 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
I20250902 21:33:01.774158 9949 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
I20250902 21:33:02.200122 9817 heartbeater.cc:507] Master 127.4.52.254:39913 requested a full tablet report, sending...
I20250902 21:33:03.007683 4307 external_mini_cluster-itest-base.cc:80] Found fatal failure
I20250902 21:33:03.007782 4307 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 0 with UUID 6311ff5fc63a49108dc3000117399229 and pid 9689
W20250902 21:33:03.007845 6092 client.cc:2040] Couldn't close scanner e9a581f7985a4294922d1bc61d662570: Service unavailable: reactor is shutting down (error 108)
W20250902 21:33:03.011276 9571 connection.cc:537] server connection from 127.0.0.1:43972 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
************************ BEGIN STACKS **************************
[New LWP 9690]
[New LWP 9691]
[New LWP 9692]
[New LWP 9693]
[New LWP 9699]
[New LWP 9700]
[New LWP 9701]
[New LWP 9704]
[New LWP 9705]
[New LWP 9706]
[New LWP 9707]
[New LWP 9708]
[New LWP 9709]
[New LWP 9710]
[New LWP 9711]
[New LWP 9712]
[New LWP 9713]
[New LWP 9714]
[New LWP 9715]
[New LWP 9716]
[New LWP 9717]
[New LWP 9718]
[New LWP 9719]
[New LWP 9720]
[New LWP 9721]
[New LWP 9722]
[New LWP 9723]
[New LWP 9724]
[New LWP 9725]
[New LWP 9726]
[New LWP 9727]
[New LWP 9728]
[New LWP 9729]
[New LWP 9730]
[New LWP 9731]
[New LWP 9732]
[New LWP 9733]
[New LWP 9734]
[New LWP 9735]
[New LWP 9736]
[New LWP 9737]
[New LWP 9738]
[New LWP 9739]
[New LWP 9740]
[New LWP 9741]
[New LWP 9742]
[New LWP 9743]
[New LWP 9744]
[New LWP 9745]
[New LWP 9746]
[New LWP 9747]
[New LWP 9748]
[New LWP 9749]
[New LWP 9750]
[New LWP 9751]
[New LWP 9752]
[New LWP 9753]
[New LWP 9754]
[New LWP 9755]
[New LWP 9756]
[New LWP 9757]
[New LWP 9758]
[New LWP 9759]
[New LWP 9760]
[New LWP 9761]
[New LWP 9762]
[New LWP 9763]
[New LWP 9764]
[New LWP 9765]
[New LWP 9766]
[New LWP 9767]
[New LWP 9768]
[New LWP 9769]
[New LWP 9770]
[New LWP 9771]
[New LWP 9772]
[New LWP 9773]
[New LWP 9774]
[New LWP 9775]
[New LWP 9776]
[New LWP 9777]
[New LWP 9778]
[New LWP 9779]
[New LWP 9780]
[New LWP 9781]
[New LWP 9782]
[New LWP 9783]
[New LWP 9784]
[New LWP 9785]
[New LWP 9786]
[New LWP 9787]
[New LWP 9788]
[New LWP 9789]
[New LWP 9790]
[New LWP 9791]
[New LWP 9792]
[New LWP 9793]
[New LWP 9794]
[New LWP 9795]
[New LWP 9796]
[New LWP 9797]
[New LWP 9798]
[New LWP 9799]
[New LWP 9800]
[New LWP 9801]
[New LWP 9802]
[New LWP 9803]
[New LWP 9804]
[New LWP 9805]
[New LWP 9806]
[New LWP 9807]
[New LWP 9808]
[New LWP 9809]
[New LWP 9810]
[New LWP 9811]
[New LWP 9812]
[New LWP 9813]
[New LWP 9814]
[New LWP 9815]
[New LWP 9816]
[New LWP 9817]
[New LWP 9818]
0x00007f6ca01ebd50 in ?? ()
Id Target Id Frame
* 1 LWP 9689 "kudu" 0x00007f6ca01ebd50 in ?? ()
2 LWP 9690 "kudu" 0x00007f6ca01e7fb9 in ?? ()
3 LWP 9691 "kudu" 0x00007f6ca01e7fb9 in ?? ()
4 LWP 9692 "kudu" 0x00007f6ca01e7fb9 in ?? ()
5 LWP 9693 "kernel-watcher-" 0x00007f6ca01e7fb9 in ?? ()
6 LWP 9699 "ntp client-9699" 0x00007f6ca01eb9e2 in ?? ()
7 LWP 9700 "file cache-evic" 0x00007f6ca01e7fb9 in ?? ()
8 LWP 9701 "sq_acceptor" 0x00007f6c9e32bcb9 in ?? ()
9 LWP 9704 "rpc reactor-970" 0x00007f6c9e338a47 in ?? ()
10 LWP 9705 "rpc reactor-970" 0x00007f6c9e338a47 in ?? ()
11 LWP 9706 "rpc reactor-970" 0x00007f6c9e338a47 in ?? ()
12 LWP 9707 "rpc reactor-970" 0x00007f6c9e338a47 in ?? ()
13 LWP 9708 "MaintenanceMgr " 0x00007f6ca01e7ad3 in ?? ()
14 LWP 9709 "txn-status-mana" 0x00007f6ca01e7fb9 in ?? ()
15 LWP 9710 "collect_and_rem" 0x00007f6ca01e7fb9 in ?? ()
16 LWP 9711 "tc-session-exp-" 0x00007f6ca01e7fb9 in ?? ()
17 LWP 9712 "rpc worker-9712" 0x00007f6ca01e7ad3 in ?? ()
18 LWP 9713 "rpc worker-9713" 0x00007f6ca01e7ad3 in ?? ()
19 LWP 9714 "rpc worker-9714" 0x00007f6ca01e7ad3 in ?? ()
20 LWP 9715 "rpc worker-9715" 0x00007f6ca01e7ad3 in ?? ()
21 LWP 9716 "rpc worker-9716" 0x00007f6ca01e7ad3 in ?? ()
22 LWP 9717 "rpc worker-9717" 0x00007f6ca01e7ad3 in ?? ()
23 LWP 9718 "rpc worker-9718" 0x00007f6ca01e7ad3 in ?? ()
24 LWP 9719 "rpc worker-9719" 0x00007f6ca01e7ad3 in ?? ()
25 LWP 9720 "rpc worker-9720" 0x00007f6ca01e7ad3 in ?? ()
26 LWP 9721 "rpc worker-9721" 0x00007f6ca01e7ad3 in ?? ()
27 LWP 9722 "rpc worker-9722" 0x00007f6ca01e7ad3 in ?? ()
28 LWP 9723 "rpc worker-9723" 0x00007f6ca01e7ad3 in ?? ()
29 LWP 9724 "rpc worker-9724" 0x00007f6ca01e7ad3 in ?? ()
30 LWP 9725 "rpc worker-9725" 0x00007f6ca01e7ad3 in ?? ()
31 LWP 9726 "rpc worker-9726" 0x00007f6ca01e7ad3 in ?? ()
32 LWP 9727 "rpc worker-9727" 0x00007f6ca01e7ad3 in ?? ()
33 LWP 9728 "rpc worker-9728" 0x00007f6ca01e7ad3 in ?? ()
34 LWP 9729 "rpc worker-9729" 0x00007f6ca01e7ad3 in ?? ()
35 LWP 9730 "rpc worker-9730" 0x00007f6ca01e7ad3 in ?? ()
36 LWP 9731 "rpc worker-9731" 0x00007f6ca01e7ad3 in ?? ()
37 LWP 9732 "rpc worker-9732" 0x00007f6ca01e7ad3 in ?? ()
38 LWP 9733 "rpc worker-9733" 0x00007f6ca01e7ad3 in ?? ()
39 LWP 9734 "rpc worker-9734" 0x00007f6ca01e7ad3 in ?? ()
40 LWP 9735 "rpc worker-9735" 0x00007f6ca01e7ad3 in ?? ()
41 LWP 9736 "rpc worker-9736" 0x00007f6ca01e7ad3 in ?? ()
42 LWP 9737 "rpc worker-9737" 0x00007f6ca01e7ad3 in ?? ()
43 LWP 9738 "rpc worker-9738" 0x00007f6ca01e7ad3 in ?? ()
44 LWP 9739 "rpc worker-9739" 0x00007f6ca01e7ad3 in ?? ()
45 LWP 9740 "rpc worker-9740" 0x00007f6ca01e7ad3 in ?? ()
46 LWP 9741 "rpc worker-9741" 0x00007f6ca01e7ad3 in ?? ()
47 LWP 9742 "rpc worker-9742" 0x00007f6ca01e7ad3 in ?? ()
48 LWP 9743 "rpc worker-9743" 0x00007f6ca01e7ad3 in ?? ()
49 LWP 9744 "rpc worker-9744" 0x00007f6ca01e7ad3 in ?? ()
50 LWP 9745 "rpc worker-9745" 0x00007f6ca01e7ad3 in ?? ()
51 LWP 9746 "rpc worker-9746" 0x00007f6ca01e7ad3 in ?? ()
52 LWP 9747 "rpc worker-9747" 0x00007f6ca01e7ad3 in ?? ()
53 LWP 9748 "rpc worker-9748" 0x00007f6ca01e7ad3 in ?? ()
54 LWP 9749 "rpc worker-9749" 0x00007f6ca01e7ad3 in ?? ()
55 LWP 9750 "rpc worker-9750" 0x00007f6ca01e7ad3 in ?? ()
56 LWP 9751 "rpc worker-9751" 0x00007f6ca01e7ad3 in ?? ()
57 LWP 9752 "rpc worker-9752" 0x00007f6ca01e7ad3 in ?? ()
58 LWP 9753 "rpc worker-9753" 0x00007f6ca01e7ad3 in ?? ()
59 LWP 9754 "rpc worker-9754" 0x00007f6ca01e7ad3 in ?? ()
60 LWP 9755 "rpc worker-9755" 0x00007f6ca01e7ad3 in ?? ()
61 LWP 9756 "rpc worker-9756" 0x00007f6ca01e7ad3 in ?? ()
62 LWP 9757 "rpc worker-9757" 0x00007f6ca01e7ad3 in ?? ()
63 LWP 9758 "rpc worker-9758" 0x00007f6ca01e7ad3 in ?? ()
64 LWP 9759 "rpc worker-9759" 0x00007f6ca01e7ad3 in ?? ()
65 LWP 9760 "rpc worker-9760" 0x00007f6ca01e7ad3 in ?? ()
66 LWP 9761 "rpc worker-9761" 0x00007f6ca01e7ad3 in ?? ()
67 LWP 9762 "rpc worker-9762" 0x00007f6ca01e7ad3 in ?? ()
68 LWP 9763 "rpc worker-9763" 0x00007f6ca01e7ad3 in ?? ()
69 LWP 9764 "rpc worker-9764" 0x00007f6ca01e7ad3 in ?? ()
70 LWP 9765 "rpc worker-9765" 0x00007f6ca01e7ad3 in ?? ()
71 LWP 9766 "rpc worker-9766" 0x00007f6ca01e7ad3 in ?? ()
72 LWP 9767 "rpc worker-9767" 0x00007f6ca01e7ad3 in ?? ()
73 LWP 9768 "rpc worker-9768" 0x00007f6ca01e7ad3 in ?? ()
74 LWP 9769 "rpc worker-9769" 0x00007f6ca01e7ad3 in ?? ()
75 LWP 9770 "rpc worker-9770" 0x00007f6ca01e7ad3 in ?? ()
76 LWP 9771 "rpc worker-9771" 0x00007f6ca01e7ad3 in ?? ()
77 LWP 9772 "rpc worker-9772" 0x00007f6ca01e7ad3 in ?? ()
78 LWP 9773 "rpc worker-9773" 0x00007f6ca01e7ad3 in ?? ()
79 LWP 9774 "rpc worker-9774" 0x00007f6ca01e7ad3 in ?? ()
80 LWP 9775 "rpc worker-9775" 0x00007f6ca01e7ad3 in ?? ()
81 LWP 9776 "rpc worker-9776" 0x00007f6ca01e7ad3 in ?? ()
82 LWP 9777 "rpc worker-9777" 0x00007f6ca01e7ad3 in ?? ()
83 LWP 9778 "rpc worker-9778" 0x00007f6ca01e7ad3 in ?? ()
84 LWP 9779 "rpc worker-9779" 0x00007f6ca01e7ad3 in ?? ()
85 LWP 9780 "rpc worker-9780" 0x00007f6ca01e7ad3 in ?? ()
86 LWP 9781 "rpc worker-9781" 0x00007f6ca01e7ad3 in ?? ()
87 LWP 9782 "rpc worker-9782" 0x00007f6ca01e7ad3 in ?? ()
88 LWP 9783 "rpc worker-9783" 0x00007f6ca01e7ad3 in ?? ()
89 LWP 9784 "rpc worker-9784" 0x00007f6ca01e7ad3 in ?? ()
90 LWP 9785 "rpc worker-9785" 0x00007f6ca01e7ad3 in ?? ()
91 LWP 9786 "rpc worker-9786" 0x00007f6ca01e7ad3 in ?? ()
92 LWP 9787 "rpc worker-9787" 0x00007f6ca01e7ad3 in ?? ()
93 LWP 9788 "rpc worker-9788" 0x00007f6ca01e7ad3 in ?? ()
94 LWP 9789 "rpc worker-9789" 0x00007f6ca01e7ad3 in ?? ()
95 LWP 9790 "rpc worker-9790" 0x00007f6ca01e7ad3 in ?? ()
96 LWP 9791 "rpc worker-9791" 0x00007f6ca01e7ad3 in ?? ()
97 LWP 9792 "rpc worker-9792" 0x00007f6ca01e7ad3 in ?? ()
98 LWP 9793 "rpc worker-9793" 0x00007f6ca01e7ad3 in ?? ()
99 LWP 9794 "rpc worker-9794" 0x00007f6ca01e7ad3 in ?? ()
100 LWP 9795 "rpc worker-9795" 0x00007f6ca01e7ad3 in ?? ()
101 LWP 9796 "rpc worker-9796" 0x00007f6ca01e7ad3 in ?? ()
102 LWP 9797 "rpc worker-9797" 0x00007f6ca01e7ad3 in ?? ()
103 LWP 9798 "rpc worker-9798" 0x00007f6ca01e7ad3 in ?? ()
104 LWP 9799 "rpc worker-9799" 0x00007f6ca01e7ad3 in ?? ()
105 LWP 9800 "rpc worker-9800" 0x00007f6ca01e7ad3 in ?? ()
106 LWP 9801 "rpc worker-9801" 0x00007f6ca01e7ad3 in ?? ()
107 LWP 9802 "rpc worker-9802" 0x00007f6ca01e7ad3 in ?? ()
108 LWP 9803 "rpc worker-9803" 0x00007f6ca01e7ad3 in ?? ()
109 LWP 9804 "rpc worker-9804" 0x00007f6ca01e7ad3 in ?? ()
110 LWP 9805 "rpc worker-9805" 0x00007f6ca01e7ad3 in ?? ()
111 LWP 9806 "rpc worker-9806" 0x00007f6ca01e7ad3 in ?? ()
112 LWP 9807 "rpc worker-9807" 0x00007f6ca01e7ad3 in ?? ()
113 LWP 9808 "rpc worker-9808" 0x00007f6ca01e7ad3 in ?? ()
114 LWP 9809 "rpc worker-9809" 0x00007f6ca01e7ad3 in ?? ()
115 LWP 9810 "rpc worker-9810" 0x00007f6ca01e7ad3 in ?? ()
116 LWP 9811 "rpc worker-9811" 0x00007f6ca01e7ad3 in ?? ()
117 LWP 9812 "diag-logger-981" 0x00007f6ca01e7fb9 in ?? ()
118 LWP 9813 "result-tracker-" 0x00007f6ca01e7fb9 in ?? ()
119 LWP 9814 "excess-log-dele" 0x00007f6ca01e7fb9 in ?? ()
120 LWP 9815 "tcmalloc-memory" 0x00007f6ca01e7fb9 in ?? ()
121 LWP 9816 "acceptor-9816" 0x00007f6c9e33a0c7 in ?? ()
122 LWP 9817 "heartbeat-9817" 0x00007f6ca01e7fb9 in ?? ()
123 LWP 9818 "maintenance_sch" 0x00007f6ca01e7fb9 in ?? ()
Thread 123 (LWP 9818):
#0 0x00007f6ca01e7fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000020 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000055736050be50 in ?? ()
#5 0x00007f6c574c6470 in ?? ()
#6 0x0000000000000040 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 122 (LWP 9817):
#0 0x00007f6ca01e7fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000008 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000557360485930 in ?? ()
#5 0x00007f6c57cc73f0 in ?? ()
#6 0x0000000000000010 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 121 (LWP 9816):
#0 0x00007f6c9e33a0c7 in ?? ()
#1 0x00007f6c584c8010 in ?? ()
#2 0x00007f6c9fe9a2f2 in ?? ()
#3 0x00007f6c584c8010 in ?? ()
#4 0x0000000000080800 in ?? ()
#5 0x00007f6c584c83d0 in ?? ()
#6 0x00007f6c584c8080 in ?? ()
#7 0x0000557360430c48 in ?? ()
#8 0x00007f6c9fe9fd89 in ?? ()
#9 0x00007f6c584c8500 in ?? ()
#10 0x00007f6c584c8700 in ?? ()
#11 0x00000080584c83c0 in ?? ()
#12 0x00007f6ca17048ca in ?? () from /lib64/ld-linux-x86-64.so.2
Backtrace stopped: previous frame inner to this frame (corrupt stack?)
Thread 120 (LWP 9815):
#0 0x00007f6ca01e7fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000001 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x00007ffc237e2ed0 in ?? ()
#5 0x00007f6c58cc9670 in ?? ()
#6 0x0000000000000002 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 119 (LWP 9814):
#0 0x00007f6ca01e7fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 118 (LWP 9813):
#0 0x00007f6ca01e7fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000008 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x00005573603b9b70 in ?? ()
#5 0x00007f6c59ccb680 in ?? ()
#6 0x0000000000000010 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 117 (LWP 9812):
#0 0x00007f6ca01e7fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000008 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x00005573606be690 in ?? ()
#5 0x00007f6c5a4cc550 in ?? ()
#6 0x0000000000000010 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 116 (LWP 9811):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000007 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x0000557360690ebc in ?? ()
#4 0x00007f6c5accd5d0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f6c5accd5f0 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x0000557360690ea8 in ?? ()
#9 0x00007f6ca01e7770 in ?? ()
#10 0x00007f6c5accd5f0 in ?? ()
#11 0x00007f6c5accd650 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 115 (LWP 9810):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x0000557360690e3c in ?? ()
#4 0x00007f6c5b4ce5d0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f6c5b4ce5f0 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x0000557360690e28 in ?? ()
#9 0x00007f6ca01e7770 in ?? ()
#10 0x00007f6c5b4ce5f0 in ?? ()
#11 0x00007f6c5b4ce650 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 114 (LWP 9809):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 113 (LWP 9808):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 112 (LWP 9807):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 111 (LWP 9806):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 110 (LWP 9805):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 109 (LWP 9804):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 108 (LWP 9803):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 107 (LWP 9802):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 106 (LWP 9801):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 105 (LWP 9800):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 104 (LWP 9799):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 103 (LWP 9798):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 102 (LWP 9797):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 101 (LWP 9796):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 100 (LWP 9795):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 99 (LWP 9794):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 98 (LWP 9793):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 97 (LWP 9792):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 96 (LWP 9791):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 95 (LWP 9790):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 94 (LWP 9789):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 93 (LWP 9788):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 92 (LWP 9787):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 91 (LWP 9786):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 90 (LWP 9785):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 89 (LWP 9784):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 88 (LWP 9783):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 87 (LWP 9782):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 86 (LWP 9781):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 85 (LWP 9780):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 84 (LWP 9779):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 83 (LWP 9778):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 82 (LWP 9777):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 81 (LWP 9776):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 80 (LWP 9775):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 79 (LWP 9774):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 78 (LWP 9773):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 77 (LWP 9772):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 76 (LWP 9771):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000002 in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x00005573606838b8 in ?? ()
#4 0x00007f6c6ecf55d0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f6c6ecf55f0 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 75 (LWP 9770):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 74 (LWP 9769):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 73 (LWP 9768):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 72 (LWP 9767):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 71 (LWP 9766):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 70 (LWP 9765):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 69 (LWP 9764):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 68 (LWP 9763):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 67 (LWP 9762):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 66 (LWP 9761):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 65 (LWP 9760):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 64 (LWP 9759):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 63 (LWP 9758):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 62 (LWP 9757):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 61 (LWP 9756):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 60 (LWP 9755):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 59 (LWP 9754):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 58 (LWP 9753):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 57 (LWP 9752):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 56 (LWP 9751):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000002 in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x0000557360682db8 in ?? ()
#4 0x00007f6c78d095d0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f6c78d095f0 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 55 (LWP 9750):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 54 (LWP 9749):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 53 (LWP 9748):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 52 (LWP 9747):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 51 (LWP 9746):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 50 (LWP 9745):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 49 (LWP 9744):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 48 (LWP 9743):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 47 (LWP 9742):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 46 (LWP 9741):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 45 (LWP 9740):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 44 (LWP 9739):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 43 (LWP 9738):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 42 (LWP 9737):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 41 (LWP 9736):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 40 (LWP 9735):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 39 (LWP 9734):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 38 (LWP 9733):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 37 (LWP 9732):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 36 (LWP 9731):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000002 in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x00005573606822b8 in ?? ()
#4 0x00007f6c82d1d5d0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f6c82d1d5f0 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 35 (LWP 9730):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 34 (LWP 9729):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 33 (LWP 9728):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 32 (LWP 9727):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 31 (LWP 9726):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 30 (LWP 9725):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 29 (LWP 9724):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 28 (LWP 9723):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 27 (LWP 9722):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 26 (LWP 9721):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 25 (LWP 9720):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 24 (LWP 9719):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 23 (LWP 9718):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 22 (LWP 9717):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 21 (LWP 9716):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 20 (LWP 9715):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 19 (LWP 9714):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 18 (LWP 9713):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 17 (LWP 9712):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 16 (LWP 9711):
#0 0x00007f6ca01e7fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 15 (LWP 9710):
#0 0x00007f6ca01e7fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000001 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000055736039f6c8 in ?? ()
#5 0x00007f6c8d5326a0 in ?? ()
#6 0x0000000000000002 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 14 (LWP 9709):
#0 0x00007f6ca01e7fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 13 (LWP 9708):
#0 0x00007f6ca01e7ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 12 (LWP 9707):
#0 0x00007f6c9e338a47 in ?? ()
#1 0x00007f6c8ed35680 in ?? ()
#2 0x00007f6c99644571 in ?? ()
#3 0x00000000000000ca in ?? ()
#4 0x000055736049ce58 in ?? ()
#5 0x00007f6c8ed356c0 in ?? ()
#6 0x00007f6c8ed35840 in ?? ()
#7 0x0000557360577030 in ?? ()
#8 0x00007f6c9964625d in ?? ()
#9 0x3fb96ab179b73000 in ?? ()
#10 0x0000557360489180 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x0000557360489180 in ?? ()
#13 0x000000006049ce58 in ?? ()
#14 0x0000557300000000 in ?? ()
#15 0x41da2dd847e1f7c9 in ?? ()
#16 0x0000557360577030 in ?? ()
#17 0x00007f6c8ed35720 in ?? ()
#18 0x00007f6c9964aba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb96ab179b73000 in ?? ()
#21 0x0000000000000000 in ?? ()
Thread 11 (LWP 9706):
#0 0x00007f6c9e338a47 in ?? ()
#1 0x00007f6c8f536680 in ?? ()
#2 0x00007f6c99644571 in ?? ()
#3 0x00000000000000ca in ?? ()
#4 0x000055736049da98 in ?? ()
#5 0x00007f6c8f5366c0 in ?? ()
#6 0x00007f6c8f536840 in ?? ()
#7 0x0000557360577030 in ?? ()
#8 0x00007f6c9964625d in ?? ()
#9 0x3fb98a0809c47000 in ?? ()
#10 0x0000557360488c00 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x0000557360488c00 in ?? ()
#13 0x000000006049da98 in ?? ()
#14 0x0000557300000000 in ?? ()
#15 0x41da2dd847e1f7cc in ?? ()
#16 0x0000557360577030 in ?? ()
#17 0x00007f6c8f536720 in ?? ()
#18 0x00007f6c9964aba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb98a0809c47000 in ?? ()
#21 0x0000000000000000 in ?? ()
Thread 10 (LWP 9705):
#0 0x00007f6c9e338a47 in ?? ()
#1 0x00007f6c8fd37680 in ?? ()
#2 0x00007f6c99644571 in ?? ()
#3 0x00000000000000ca in ?? ()
#4 0x000055736049dc58 in ?? ()
#5 0x00007f6c8fd376c0 in ?? ()
#6 0x00007f6c8fd37840 in ?? ()
#7 0x0000557360577030 in ?? ()
#8 0x00007f6c9964625d in ?? ()
#9 0x3fb967e9f4271000 in ?? ()
#10 0x0000557360487b80 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x0000557360487b80 in ?? ()
#13 0x000000006049dc58 in ?? ()
#14 0x0000557300000000 in ?? ()
#15 0x41da2dd847e1f7cc in ?? ()
#16 0x0000557360577030 in ?? ()
#17 0x00007f6c8fd37720 in ?? ()
#18 0x00007f6c9964aba3 in ?? ()
#19 0x0000000000000000 in ?? ()
Thread 9 (LWP 9704):
#0 0x00007f6c9e338a47 in ?? ()
#1 0x00007f6c91922680 in ?? ()
#2 0x00007f6c99644571 in ?? ()
#3 0x00000000000000ca in ?? ()
#4 0x000055736049de18 in ?? ()
#5 0x00007f6c919226c0 in ?? ()
#6 0x00007f6c91922840 in ?? ()
#7 0x0000557360577030 in ?? ()
#8 0x00007f6c9964625d in ?? ()
#9 0x3fb96cd71d23a000 in ?? ()
#10 0x0000557360488100 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x0000557360488100 in ?? ()
#13 0x000000006049de18 in ?? ()
#14 0x0000557300000000 in ?? ()
#15 0x41da2dd847e1f7ca in ?? ()
#16 0x0000557360577030 in ?? ()
#17 0x00007f6c91922720 in ?? ()
#18 0x00007f6c9964aba3 in ?? ()
#19 0x0000000000000000 in ?? ()
Thread 8 (LWP 9701):
#0 0x00007f6c9e32bcb9 in ?? ()
#1 0x00007f6c931259c0 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 7 (LWP 9700):
#0 0x00007f6ca01e7fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 6 (LWP 9699):
#0 0x00007f6ca01eb9e2 in ?? ()
#1 0x00005573603b9ee0 in ?? ()
#2 0x00007f6c921234d0 in ?? ()
#3 0x00007f6c92123450 in ?? ()
#4 0x00007f6c92123570 in ?? ()
#5 0x00007f6c92123790 in ?? ()
#6 0x00007f6c921237a0 in ?? ()
#7 0x00007f6c921234e0 in ?? ()
#8 0x00007f6c921234d0 in ?? ()
#9 0x00005573603b8350 in ?? ()
#10 0x00007f6ca05d6c6f in ?? ()
#11 0x3ff0000000000000 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 5 (LWP 9693):
#0 0x00007f6ca01e7fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000029 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000055736053edc8 in ?? ()
#5 0x00007f6c94127430 in ?? ()
#6 0x0000000000000052 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 4 (LWP 9692):
#0 0x00007f6ca01e7fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000002 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000055736039e848 in ?? ()
#5 0x00007f6c94928790 in ?? ()
#6 0x0000000000000004 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 3 (LWP 9691):
#0 0x00007f6ca01e7fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000002 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000055736039e2a8 in ?? ()
#5 0x00007f6c95129790 in ?? ()
#6 0x0000000000000004 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 2 (LWP 9690):
#0 0x00007f6ca01e7fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000002 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000055736039e188 in ?? ()
#5 0x00007f6c9592a790 in ?? ()
#6 0x0000000000000004 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 1 (LWP 9689):
#0 0x00007f6ca01ebd50 in ?? ()
#1 0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20250902 21:33:03.477288 4307 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 1 with UUID b41910bc980f4cfdbcc6eb23e1084325 and pid 9554
************************ BEGIN STACKS **************************
[New LWP 9555]
[New LWP 9556]
[New LWP 9557]
[New LWP 9558]
[New LWP 9564]
[New LWP 9566]
[New LWP 9567]
[New LWP 9570]
[New LWP 9571]
[New LWP 9572]
[New LWP 9573]
[New LWP 9574]
[New LWP 9575]
[New LWP 9577]
[New LWP 9578]
[New LWP 9579]
[New LWP 9580]
[New LWP 9581]
[New LWP 9582]
[New LWP 9583]
[New LWP 9584]
[New LWP 9585]
[New LWP 9586]
[New LWP 9587]
[New LWP 9588]
[New LWP 9589]
[New LWP 9590]
[New LWP 9591]
[New LWP 9592]
[New LWP 9593]
[New LWP 9594]
[New LWP 9595]
[New LWP 9596]
[New LWP 9597]
[New LWP 9598]
[New LWP 9599]
[New LWP 9600]
[New LWP 9601]
[New LWP 9602]
[New LWP 9603]
[New LWP 9604]
[New LWP 9605]
[New LWP 9606]
[New LWP 9607]
[New LWP 9608]
[New LWP 9609]
[New LWP 9610]
[New LWP 9611]
[New LWP 9612]
[New LWP 9613]
[New LWP 9614]
[New LWP 9615]
[New LWP 9616]
[New LWP 9617]
[New LWP 9618]
[New LWP 9619]
[New LWP 9620]
[New LWP 9621]
[New LWP 9622]
[New LWP 9623]
[New LWP 9624]
[New LWP 9625]
[New LWP 9626]
[New LWP 9627]
[New LWP 9628]
[New LWP 9629]
[New LWP 9630]
[New LWP 9631]
[New LWP 9632]
[New LWP 9633]
[New LWP 9634]
[New LWP 9635]
[New LWP 9636]
[New LWP 9637]
[New LWP 9638]
[New LWP 9639]
[New LWP 9640]
[New LWP 9641]
[New LWP 9642]
[New LWP 9643]
[New LWP 9644]
[New LWP 9645]
[New LWP 9646]
[New LWP 9647]
[New LWP 9648]
[New LWP 9649]
[New LWP 9650]
[New LWP 9651]
[New LWP 9652]
[New LWP 9653]
[New LWP 9654]
[New LWP 9655]
[New LWP 9656]
[New LWP 9657]
[New LWP 9658]
[New LWP 9659]
[New LWP 9660]
[New LWP 9661]
[New LWP 9662]
[New LWP 9663]
[New LWP 9664]
[New LWP 9665]
[New LWP 9666]
[New LWP 9667]
[New LWP 9668]
[New LWP 9669]
[New LWP 9670]
[New LWP 9671]
[New LWP 9672]
[New LWP 9673]
[New LWP 9674]
[New LWP 9675]
[New LWP 9676]
[New LWP 9677]
[New LWP 9678]
[New LWP 9679]
[New LWP 9680]
[New LWP 9681]
[New LWP 9682]
[New LWP 9683]
[New LWP 9684]
[New LWP 9685]
[New LWP 10262]
0x00007f2cf6a00d50 in ?? ()
Id Target Id Frame
* 1 LWP 9554 "kudu" 0x00007f2cf6a00d50 in ?? ()
2 LWP 9555 "kudu" 0x00007f2cf69fcfb9 in ?? ()
3 LWP 9556 "kudu" 0x00007f2cf69fcfb9 in ?? ()
4 LWP 9557 "kudu" 0x00007f2cf69fcfb9 in ?? ()
5 LWP 9558 "kernel-watcher-" 0x00007f2cf69fcfb9 in ?? ()
6 LWP 9564 "ntp client-9564" 0x00007f2cf6a009e2 in ?? ()
7 LWP 9566 "file cache-evic" 0x00007f2cf69fcfb9 in ?? ()
8 LWP 9567 "sq_acceptor" 0x00007f2cf4b40cb9 in ?? ()
9 LWP 9570 "rpc reactor-957" 0x00007f2cf4b4da47 in ?? ()
10 LWP 9571 "rpc reactor-957" 0x00007f2cf4b4da47 in ?? ()
11 LWP 9572 "rpc reactor-957" 0x00007f2cf4b4da47 in ?? ()
12 LWP 9573 "rpc reactor-957" 0x00007f2cf4b4da47 in ?? ()
13 LWP 9574 "MaintenanceMgr " 0x00007f2cf69fcad3 in ?? ()
14 LWP 9575 "txn-status-mana" 0x00007f2cf69fcfb9 in ?? ()
15 LWP 9577 "collect_and_rem" 0x00007f2cf69fcfb9 in ?? ()
16 LWP 9578 "tc-session-exp-" 0x00007f2cf69fcfb9 in ?? ()
17 LWP 9579 "rpc worker-9579" 0x00007f2cf69fcad3 in ?? ()
18 LWP 9580 "rpc worker-9580" 0x00007f2cf69fcad3 in ?? ()
19 LWP 9581 "rpc worker-9581" 0x00007f2cf69fcad3 in ?? ()
20 LWP 9582 "rpc worker-9582" 0x00007f2cf69fcad3 in ?? ()
21 LWP 9583 "rpc worker-9583" 0x00007f2cf69fcad3 in ?? ()
22 LWP 9584 "rpc worker-9584" 0x00007f2cf69fcad3 in ?? ()
23 LWP 9585 "rpc worker-9585" 0x00007f2cf69fcad3 in ?? ()
24 LWP 9586 "rpc worker-9586" 0x00007f2cf69fcad3 in ?? ()
25 LWP 9587 "rpc worker-9587" 0x00007f2cf69fcad3 in ?? ()
26 LWP 9588 "rpc worker-9588" 0x00007f2cf69fcad3 in ?? ()
27 LWP 9589 "rpc worker-9589" 0x00007f2cf69fcad3 in ?? ()
28 LWP 9590 "rpc worker-9590" 0x00007f2cf69fcad3 in ?? ()
29 LWP 9591 "rpc worker-9591" 0x00007f2cf69fcad3 in ?? ()
30 LWP 9592 "rpc worker-9592" 0x00007f2cf69fcad3 in ?? ()
31 LWP 9593 "rpc worker-9593" 0x00007f2cf69fcad3 in ?? ()
32 LWP 9594 "rpc worker-9594" 0x00007f2cf69fcad3 in ?? ()
33 LWP 9595 "rpc worker-9595" 0x00007f2cf69fcad3 in ?? ()
34 LWP 9596 "rpc worker-9596" 0x00007f2cf69fcad3 in ?? ()
35 LWP 9597 "rpc worker-9597" 0x00007f2cf69fcad3 in ?? ()
36 LWP 9598 "rpc worker-9598" 0x00007f2cf69fcad3 in ?? ()
37 LWP 9599 "rpc worker-9599" 0x00007f2cf69fcad3 in ?? ()
38 LWP 9600 "rpc worker-9600" 0x00007f2cf69fcad3 in ?? ()
39 LWP 9601 "rpc worker-9601" 0x00007f2cf69fcad3 in ?? ()
40 LWP 9602 "rpc worker-9602" 0x00007f2cf69fcad3 in ?? ()
41 LWP 9603 "rpc worker-9603" 0x00007f2cf69fcad3 in ?? ()
42 LWP 9604 "rpc worker-9604" 0x00007f2cf69fcad3 in ?? ()
43 LWP 9605 "rpc worker-9605" 0x00007f2cf69fcad3 in ?? ()
44 LWP 9606 "rpc worker-9606" 0x00007f2cf69fcad3 in ?? ()
45 LWP 9607 "rpc worker-9607" 0x00007f2cf69fcad3 in ?? ()
46 LWP 9608 "rpc worker-9608" 0x00007f2cf69fcad3 in ?? ()
47 LWP 9609 "rpc worker-9609" 0x00007f2cf69fcad3 in ?? ()
48 LWP 9610 "rpc worker-9610" 0x00007f2cf69fcad3 in ?? ()
49 LWP 9611 "rpc worker-9611" 0x00007f2cf69fcad3 in ?? ()
50 LWP 9612 "rpc worker-9612" 0x00007f2cf69fcad3 in ?? ()
51 LWP 9613 "rpc worker-9613" 0x00007f2cf69fcad3 in ?? ()
52 LWP 9614 "rpc worker-9614" 0x00007f2cf69fcad3 in ?? ()
53 LWP 9615 "rpc worker-9615" 0x00007f2cf69fcad3 in ?? ()
54 LWP 9616 "rpc worker-9616" 0x00007f2cf69fcad3 in ?? ()
55 LWP 9617 "rpc worker-9617" 0x00007f2cf69fcad3 in ?? ()
56 LWP 9618 "rpc worker-9618" 0x00007f2cf69fcad3 in ?? ()
57 LWP 9619 "rpc worker-9619" 0x00007f2cf69fcad3 in ?? ()
58 LWP 9620 "rpc worker-9620" 0x00007f2cf69fcad3 in ?? ()
59 LWP 9621 "rpc worker-9621" 0x00007f2cf69fcad3 in ?? ()
60 LWP 9622 "rpc worker-9622" 0x00007f2cf69fcad3 in ?? ()
61 LWP 9623 "rpc worker-9623" 0x00007f2cf69fcad3 in ?? ()
62 LWP 9624 "rpc worker-9624" 0x00007f2cf69fcad3 in ?? ()
63 LWP 9625 "rpc worker-9625" 0x00007f2cf69fcad3 in ?? ()
64 LWP 9626 "rpc worker-9626" 0x00007f2cf69fcad3 in ?? ()
65 LWP 9627 "rpc worker-9627" 0x00007f2cf69fcad3 in ?? ()
66 LWP 9628 "rpc worker-9628" 0x00007f2cf69fcad3 in ?? ()
67 LWP 9629 "rpc worker-9629" 0x00007f2cf69fcad3 in ?? ()
68 LWP 9630 "rpc worker-9630" 0x00007f2cf69fcad3 in ?? ()
69 LWP 9631 "rpc worker-9631" 0x00007f2cf69fcad3 in ?? ()
70 LWP 9632 "rpc worker-9632" 0x00007f2cf69fcad3 in ?? ()
71 LWP 9633 "rpc worker-9633" 0x00007f2cf69fcad3 in ?? ()
72 LWP 9634 "rpc worker-9634" 0x00007f2cf69fcad3 in ?? ()
73 LWP 9635 "rpc worker-9635" 0x00007f2cf69fcad3 in ?? ()
74 LWP 9636 "rpc worker-9636" 0x00007f2cf69fcad3 in ?? ()
75 LWP 9637 "rpc worker-9637" 0x00007f2cf69fcad3 in ?? ()
76 LWP 9638 "rpc worker-9638" 0x00007f2cf69fcad3 in ?? ()
77 LWP 9639 "rpc worker-9639" 0x00007f2cf69fcad3 in ?? ()
78 LWP 9640 "rpc worker-9640" 0x00007f2cf69fcad3 in ?? ()
79 LWP 9641 "rpc worker-9641" 0x00007f2cf69fcad3 in ?? ()
80 LWP 9642 "rpc worker-9642" 0x00007f2cf69fcad3 in ?? ()
81 LWP 9643 "rpc worker-9643" 0x00007f2cf69fcad3 in ?? ()
82 LWP 9644 "rpc worker-9644" 0x00007f2cf69fcad3 in ?? ()
83 LWP 9645 "rpc worker-9645" 0x00007f2cf69fcad3 in ?? ()
84 LWP 9646 "rpc worker-9646" 0x00007f2cf69fcad3 in ?? ()
85 LWP 9647 "rpc worker-9647" 0x00007f2cf69fcad3 in ?? ()
86 LWP 9648 "rpc worker-9648" 0x00007f2cf69fcad3 in ?? ()
87 LWP 9649 "rpc worker-9649" 0x00007f2cf69fcad3 in ?? ()
88 LWP 9650 "rpc worker-9650" 0x00007f2cf69fcad3 in ?? ()
89 LWP 9651 "rpc worker-9651" 0x00007f2cf69fcad3 in ?? ()
90 LWP 9652 "rpc worker-9652" 0x00007f2cf69fcad3 in ?? ()
91 LWP 9653 "rpc worker-9653" 0x00007f2cf69fcad3 in ?? ()
92 LWP 9654 "rpc worker-9654" 0x00007f2cf69fcad3 in ?? ()
93 LWP 9655 "rpc worker-9655" 0x00007f2cf69fcad3 in ?? ()
94 LWP 9656 "rpc worker-9656" 0x00007f2cf69fcad3 in ?? ()
95 LWP 9657 "rpc worker-9657" 0x00007f2cf69fcad3 in ?? ()
96 LWP 9658 "rpc worker-9658" 0x00007f2cf69fcad3 in ?? ()
97 LWP 9659 "rpc worker-9659" 0x00007f2cf69fcad3 in ?? ()
98 LWP 9660 "rpc worker-9660" 0x00007f2cf69fcad3 in ?? ()
99 LWP 9661 "rpc worker-9661" 0x00007f2cf69fcad3 in ?? ()
100 LWP 9662 "rpc worker-9662" 0x00007f2cf69fcad3 in ?? ()
101 LWP 9663 "rpc worker-9663" 0x00007f2cf69fcad3 in ?? ()
102 LWP 9664 "rpc worker-9664" 0x00007f2cf69fcad3 in ?? ()
103 LWP 9665 "rpc worker-9665" 0x00007f2cf69fcad3 in ?? ()
104 LWP 9666 "rpc worker-9666" 0x00007f2cf69fcad3 in ?? ()
105 LWP 9667 "rpc worker-9667" 0x00007f2cf69fcad3 in ?? ()
106 LWP 9668 "rpc worker-9668" 0x00007f2cf69fcad3 in ?? ()
107 LWP 9669 "rpc worker-9669" 0x00007f2cf69fcad3 in ?? ()
108 LWP 9670 "rpc worker-9670" 0x00007f2cf69fcad3 in ?? ()
109 LWP 9671 "rpc worker-9671" 0x00007f2cf69fcad3 in ?? ()
110 LWP 9672 "rpc worker-9672" 0x00007f2cf69fcad3 in ?? ()
111 LWP 9673 "rpc worker-9673" 0x00007f2cf69fcad3 in ?? ()
112 LWP 9674 "rpc worker-9674" 0x00007f2cf69fcad3 in ?? ()
113 LWP 9675 "rpc worker-9675" 0x00007f2cf69fcad3 in ?? ()
114 LWP 9676 "rpc worker-9676" 0x00007f2cf69fcad3 in ?? ()
115 LWP 9677 "rpc worker-9677" 0x00007f2cf69fcad3 in ?? ()
116 LWP 9678 "rpc worker-9678" 0x00007f2cf69fcad3 in ?? ()
117 LWP 9679 "diag-logger-967" 0x00007f2cf69fcfb9 in ?? ()
118 LWP 9680 "result-tracker-" 0x00007f2cf69fcfb9 in ?? ()
119 LWP 9681 "excess-log-dele" 0x00007f2cf69fcfb9 in ?? ()
120 LWP 9682 "tcmalloc-memory" 0x00007f2cf69fcfb9 in ?? ()
121 LWP 9683 "acceptor-9683" 0x00007f2cf4b4f0c7 in ?? ()
122 LWP 9684 "heartbeat-9684" 0x00007f2cf69fcfb9 in ?? ()
123 LWP 9685 "maintenance_sch" 0x00007f2cf69fcfb9 in ?? ()
124 LWP 10262 "raft [worker]-1" 0x00007f2cf69fcfb9 in ?? ()
Thread 124 (LWP 10262):
#0 0x00007f2cf69fcfb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x000000000000001b in ?? ()
#3 0x0000000100000081 in ?? ()
#4 0x00007f2ca9cd3764 in ?? ()
#5 0x00007f2ca9cd3510 in ?? ()
#6 0x0000000000000037 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x00007f2ca9cd3530 in ?? ()
#9 0x0000000200000000 in ?? ()
#10 0x0000008000000189 in ?? ()
#11 0x00007f2ca9cd3590 in ?? ()
#12 0x00007f2cf669ea11 in ?? ()
#13 0x0000000000000000 in ?? ()
Thread 123 (LWP 9685):
#0 0x00007f2cf69fcfb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000023 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000560e13345e50 in ?? ()
#5 0x00007f2cad4da470 in ?? ()
#6 0x0000000000000046 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 122 (LWP 9684):
#0 0x00007f2cf69fcfb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x000000000000000b in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000560e132bf930 in ?? ()
#5 0x00007f2cadcdb3f0 in ?? ()
#6 0x0000000000000016 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 121 (LWP 9683):
#0 0x00007f2cf4b4f0c7 in ?? ()
#1 0x00007f2cae4dc010 in ?? ()
#2 0x00007f2cf66af2f2 in ?? ()
#3 0x00007f2cae4dc010 in ?? ()
#4 0x0000000000080800 in ?? ()
#5 0x00007f2cae4dc3d0 in ?? ()
#6 0x00007f2cae4dc080 in ?? ()
#7 0x0000560e1326ac48 in ?? ()
#8 0x00007f2cf66b4d89 in ?? ()
#9 0x00007f2cae4dc500 in ?? ()
#10 0x00007f2cae4dc700 in ?? ()
#11 0x00000080ae4dc3c0 in ?? ()
#12 0x00007f2cf7f198ca in ?? () from /lib64/ld-linux-x86-64.so.2
Backtrace stopped: previous frame inner to this frame (corrupt stack?)
Thread 120 (LWP 9682):
#0 0x00007f2cf69fcfb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000001 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x00007ffe93e70e10 in ?? ()
#5 0x00007f2caecdd670 in ?? ()
#6 0x0000000000000002 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 119 (LWP 9681):
#0 0x00007f2cf69fcfb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 118 (LWP 9680):
#0 0x00007f2cf69fcfb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000008 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000560e131f3b70 in ?? ()
#5 0x00007f2cafcdf680 in ?? ()
#6 0x0000000000000010 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 117 (LWP 9679):
#0 0x00007f2cf69fcfb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000008 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000560e1356b390 in ?? ()
#5 0x00007f2cb04e0550 in ?? ()
#6 0x0000000000000010 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 116 (LWP 9678):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000007 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x0000560e1353f23c in ?? ()
#4 0x00007f2cb0ce15d0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f2cb0ce15f0 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x0000560e1353f228 in ?? ()
#9 0x00007f2cf69fc770 in ?? ()
#10 0x00007f2cb0ce15f0 in ?? ()
#11 0x00007f2cb0ce1650 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 115 (LWP 9677):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x0000560e1353f1bc in ?? ()
#4 0x00007f2cb14e25d0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f2cb14e25f0 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x0000560e1353f1a8 in ?? ()
#9 0x00007f2cf69fc770 in ?? ()
#10 0x00007f2cb14e25f0 in ?? ()
#11 0x00007f2cb14e2650 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 114 (LWP 9676):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 113 (LWP 9675):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 112 (LWP 9674):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 111 (LWP 9673):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 110 (LWP 9672):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 109 (LWP 9671):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 108 (LWP 9670):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 107 (LWP 9669):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 106 (LWP 9668):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 105 (LWP 9667):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 104 (LWP 9666):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 103 (LWP 9665):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 102 (LWP 9664):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 101 (LWP 9663):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 100 (LWP 9662):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 99 (LWP 9661):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 98 (LWP 9660):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 97 (LWP 9659):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 96 (LWP 9658):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 95 (LWP 9657):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 94 (LWP 9656):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 93 (LWP 9655):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 92 (LWP 9654):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 91 (LWP 9653):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 90 (LWP 9652):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 89 (LWP 9651):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 88 (LWP 9650):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 87 (LWP 9649):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 86 (LWP 9648):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 85 (LWP 9647):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 84 (LWP 9646):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 83 (LWP 9645):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 82 (LWP 9644):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 81 (LWP 9643):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 80 (LWP 9642):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 79 (LWP 9641):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 78 (LWP 9640):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 77 (LWP 9639):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 76 (LWP 9638):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 75 (LWP 9637):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 74 (LWP 9636):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 73 (LWP 9635):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 72 (LWP 9634):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000004 in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x0000560e1353f738 in ?? ()
#4 0x00007f2cc6d0d5d0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f2cc6d0d5f0 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 71 (LWP 9633):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 70 (LWP 9632):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 69 (LWP 9631):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 68 (LWP 9630):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 67 (LWP 9629):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 66 (LWP 9628):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 65 (LWP 9627):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 64 (LWP 9626):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 63 (LWP 9625):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 62 (LWP 9624):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 61 (LWP 9623):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 60 (LWP 9622):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 59 (LWP 9621):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 58 (LWP 9620):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 57 (LWP 9619):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 56 (LWP 9618):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000002 in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x0000560e1352f1b8 in ?? ()
#4 0x00007f2cced1d5d0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f2cced1d5f0 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 55 (LWP 9617):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 54 (LWP 9616):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 53 (LWP 9615):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 52 (LWP 9614):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 51 (LWP 9613):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 50 (LWP 9612):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 49 (LWP 9611):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 48 (LWP 9610):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 47 (LWP 9609):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 46 (LWP 9608):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 45 (LWP 9607):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 44 (LWP 9606):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 43 (LWP 9605):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 42 (LWP 9604):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 41 (LWP 9603):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 40 (LWP 9602):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 39 (LWP 9601):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 38 (LWP 9600):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 37 (LWP 9599):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 36 (LWP 9598):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000fde in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x0000560e1352e6b8 in ?? ()
#4 0x00007f2cd8d315d0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f2cd8d315f0 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 35 (LWP 9597):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000001093 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x0000560e1352e63c in ?? ()
#4 0x00007f2cd95325d0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f2cd95325f0 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x0000560e1352e628 in ?? ()
#9 0x00007f2cf69fc770 in ?? ()
#10 0x00007f2cd95325f0 in ?? ()
#11 0x00007f2cd9532650 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 34 (LWP 9596):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x000000000000056d in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x0000560e1352e5bc in ?? ()
#4 0x00007f2cd9d335d0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f2cd9d335f0 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x0000560e1352e5a8 in ?? ()
#9 0x00007f2cf69fc770 in ?? ()
#10 0x00007f2cd9d335f0 in ?? ()
#11 0x00007f2cd9d33650 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 33 (LWP 9595):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x00000000000003e7 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x0000560e1352e53c in ?? ()
#4 0x00007f2cda5345d0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f2cda5345f0 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x0000560e1352e528 in ?? ()
#9 0x00007f2cf69fc770 in ?? ()
#10 0x00007f2cda5345f0 in ?? ()
#11 0x00007f2cda534650 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 32 (LWP 9594):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000001af8 in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x0000560e1352e4b8 in ?? ()
#4 0x00007f2cdad355d0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f2cdad355f0 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 31 (LWP 9593):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000001c7b in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x0000560e1352e43c in ?? ()
#4 0x00007f2cdb5365d0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f2cdb5365f0 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x0000560e1352e428 in ?? ()
#9 0x00007f2cf69fc770 in ?? ()
#10 0x00007f2cdb5365f0 in ?? ()
#11 0x00007f2cdb536650 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 30 (LWP 9592):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 29 (LWP 9591):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 28 (LWP 9590):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 27 (LWP 9589):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 26 (LWP 9588):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 25 (LWP 9587):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 24 (LWP 9586):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 23 (LWP 9585):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 22 (LWP 9584):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 21 (LWP 9583):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 20 (LWP 9582):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 19 (LWP 9581):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 18 (LWP 9580):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 17 (LWP 9579):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 16 (LWP 9578):
#0 0x00007f2cf69fcfb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 15 (LWP 9577):
#0 0x00007f2cf69fcfb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000001 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000560e131d96c8 in ?? ()
#5 0x00007f2ce35466a0 in ?? ()
#6 0x0000000000000002 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 14 (LWP 9575):
#0 0x00007f2cf69fcfb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 13 (LWP 9574):
#0 0x00007f2cf69fcad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 12 (LWP 9573):
#0 0x00007f2cf4b4da47 in ?? ()
#1 0x00007f2ce554a680 in ?? ()
#2 0x00007f2cefe59571 in ?? ()
#3 0x00000000000000cb in ?? ()
#4 0x0000560e132d6e58 in ?? ()
#5 0x00007f2ce554a6c0 in ?? ()
#6 0x00007f2ce554a840 in ?? ()
#7 0x0000560e133b1030 in ?? ()
#8 0x00007f2cefe5b25d in ?? ()
#9 0x3fb95e30d5eb9000 in ?? ()
#10 0x0000560e132c3180 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x0000560e132c3180 in ?? ()
#13 0x00000000132d6e58 in ?? ()
#14 0x0000560e00000000 in ?? ()
#15 0x41da2dd847e1f7cc in ?? ()
#16 0x0000560e133b1030 in ?? ()
#17 0x00007f2ce554a720 in ?? ()
#18 0x00007f2cefe5fba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb95e30d5eb9000 in ?? ()
#21 0x0000000000000000 in ?? ()
Thread 11 (LWP 9572):
#0 0x00007f2cf4b4da47 in ?? ()
#1 0x00007f2ce5d4b680 in ?? ()
#2 0x00007f2cefe59571 in ?? ()
#3 0x00000000000000cb in ?? ()
#4 0x0000560e132d7a98 in ?? ()
#5 0x00007f2ce5d4b6c0 in ?? ()
#6 0x00007f2ce5d4b840 in ?? ()
#7 0x0000560e133b1030 in ?? ()
#8 0x00007f2cefe5b25d in ?? ()
#9 0x3fa1b36a69f48000 in ?? ()
#10 0x0000560e132c2c00 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x0000560e132c2c00 in ?? ()
#13 0x00000000132d7a98 in ?? ()
#14 0x0000560e00000000 in ?? ()
#15 0x41da2dd847e1f7ca in ?? ()
#16 0x0000560e133b1030 in ?? ()
#17 0x00007f2ce5d4b720 in ?? ()
#18 0x00007f2cefe5fba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fa1b36a69f48000 in ?? ()
#21 0x0000000000000000 in ?? ()
Thread 10 (LWP 9571):
#0 0x00007f2cf4b4da47 in ?? ()
#1 0x00007f2ce654c680 in ?? ()
#2 0x00007f2cefe59571 in ?? ()
#3 0x00000000000000cb in ?? ()
#4 0x0000560e132d7c58 in ?? ()
#5 0x00007f2ce654c6c0 in ?? ()
#6 0x00007f2ce654c840 in ?? ()
#7 0x0000560e133b1030 in ?? ()
#8 0x00007f2cefe5b25d in ?? ()
#9 0x3fa139633cd98000 in ?? ()
#10 0x0000560e132c2100 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x0000560e132c2100 in ?? ()
#13 0x00000000132d7c58 in ?? ()
#14 0x0000560e00000000 in ?? ()
#15 0x41da2dd847e1f7cc in ?? ()
#16 0x0000560e133b1030 in ?? ()
#17 0x00007f2ce654c720 in ?? ()
#18 0x00007f2cefe5fba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fa139633cd98000 in ?? ()
#21 0x0000000000000000 in ?? ()
Thread 9 (LWP 9570):
#0 0x00007f2cf4b4da47 in ?? ()
#1 0x00007f2ce8137680 in ?? ()
#2 0x00007f2cefe59571 in ?? ()
#3 0x00000000000000cb in ?? ()
#4 0x0000560e132d7e18 in ?? ()
#5 0x00007f2ce81376c0 in ?? ()
#6 0x00007f2ce8137840 in ?? ()
#7 0x0000560e133b1030 in ?? ()
#8 0x00007f2cefe5b25d in ?? ()
#9 0x3fa0f47224b18000 in ?? ()
#10 0x0000560e132c1600 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x0000560e132c1600 in ?? ()
#13 0x00000000132d7e18 in ?? ()
#14 0x0000560e00000000 in ?? ()
#15 0x41da2dd847e1f7cd in ?? ()
#16 0x0000560e133b1030 in ?? ()
#17 0x00007f2ce8137720 in ?? ()
#18 0x00007f2cefe5fba3 in ?? ()
#19 0x0000000000000000 in ?? ()
Thread 8 (LWP 9567):
#0 0x00007f2cf4b40cb9 in ?? ()
#1 0x00007f2ce993a9c0 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 7 (LWP 9566):
#0 0x00007f2cf69fcfb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 6 (LWP 9564):
#0 0x00007f2cf6a009e2 in ?? ()
#1 0x0000560e131f3ee0 in ?? ()
#2 0x00007f2ce89384d0 in ?? ()
#3 0x00007f2ce8938450 in ?? ()
#4 0x00007f2ce8938570 in ?? ()
#5 0x00007f2ce8938790 in ?? ()
#6 0x00007f2ce89387a0 in ?? ()
#7 0x00007f2ce89384e0 in ?? ()
#8 0x00007f2ce89384d0 in ?? ()
#9 0x0000560e131f2350 in ?? ()
#10 0x00007f2cf6debc6f in ?? ()
#11 0x3ff0000000000000 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 5 (LWP 9558):
#0 0x00007f2cf69fcfb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x000000000000002c in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000560e13378dc8 in ?? ()
#5 0x00007f2cea93c430 in ?? ()
#6 0x0000000000000058 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 4 (LWP 9557):
#0 0x00007f2cf69fcfb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000002 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000560e131d8848 in ?? ()
#5 0x00007f2ceb13d790 in ?? ()
#6 0x0000000000000004 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 3 (LWP 9556):
#0 0x00007f2cf69fcfb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000002 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000560e131d82a8 in ?? ()
#5 0x00007f2ceb93e790 in ?? ()
#6 0x0000000000000004 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 2 (LWP 9555):
#0 0x00007f2cf69fcfb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000002 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x0000560e131d8188 in ?? ()
#5 0x00007f2cec13f790 in ?? ()
#6 0x0000000000000004 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 1 (LWP 9554):
#0 0x00007f2cf6a00d50 in ?? ()
#1 0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20250902 21:33:03.967355 4307 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 2 with UUID f79e0a34bf4d4181aed51bd155010a25 and pid 9953
************************ BEGIN STACKS **************************
[New LWP 9956]
[New LWP 9957]
[New LWP 9958]
[New LWP 9959]
[New LWP 9965]
[New LWP 9966]
[New LWP 9967]
[New LWP 9970]
[New LWP 9971]
[New LWP 9972]
[New LWP 9973]
[New LWP 9974]
[New LWP 9975]
[New LWP 9977]
[New LWP 9978]
[New LWP 9979]
[New LWP 9980]
[New LWP 9981]
[New LWP 9982]
[New LWP 9983]
[New LWP 9984]
[New LWP 9985]
[New LWP 9986]
[New LWP 9987]
[New LWP 9988]
[New LWP 9989]
[New LWP 9990]
[New LWP 9991]
[New LWP 9992]
[New LWP 9993]
[New LWP 9994]
[New LWP 9995]
[New LWP 9996]
[New LWP 9997]
[New LWP 9998]
[New LWP 9999]
[New LWP 10000]
[New LWP 10001]
[New LWP 10002]
[New LWP 10003]
[New LWP 10004]
[New LWP 10005]
[New LWP 10006]
[New LWP 10007]
[New LWP 10008]
[New LWP 10009]
[New LWP 10010]
[New LWP 10011]
[New LWP 10012]
[New LWP 10013]
[New LWP 10014]
[New LWP 10015]
[New LWP 10016]
[New LWP 10017]
[New LWP 10018]
[New LWP 10019]
[New LWP 10020]
[New LWP 10021]
[New LWP 10022]
[New LWP 10023]
[New LWP 10024]
[New LWP 10025]
[New LWP 10026]
[New LWP 10027]
[New LWP 10028]
[New LWP 10029]
[New LWP 10030]
[New LWP 10031]
[New LWP 10032]
[New LWP 10033]
[New LWP 10034]
[New LWP 10035]
[New LWP 10036]
[New LWP 10037]
[New LWP 10038]
[New LWP 10039]
[New LWP 10040]
[New LWP 10041]
[New LWP 10042]
[New LWP 10043]
[New LWP 10044]
[New LWP 10045]
[New LWP 10046]
[New LWP 10047]
[New LWP 10048]
[New LWP 10049]
[New LWP 10050]
[New LWP 10051]
[New LWP 10052]
[New LWP 10053]
[New LWP 10054]
[New LWP 10055]
[New LWP 10056]
[New LWP 10057]
[New LWP 10058]
[New LWP 10059]
[New LWP 10060]
[New LWP 10061]
[New LWP 10062]
[New LWP 10063]
[New LWP 10064]
[New LWP 10065]
[New LWP 10066]
[New LWP 10067]
[New LWP 10068]
[New LWP 10069]
[New LWP 10070]
[New LWP 10071]
[New LWP 10072]
[New LWP 10073]
[New LWP 10074]
[New LWP 10075]
[New LWP 10076]
[New LWP 10077]
[New LWP 10078]
[New LWP 10079]
[New LWP 10080]
[New LWP 10081]
[New LWP 10082]
[New LWP 10083]
[New LWP 10084]
[New LWP 10085]
0x00007fc3260d1d50 in ?? ()
Id Target Id Frame
* 1 LWP 9953 "kudu" 0x00007fc3260d1d50 in ?? ()
2 LWP 9956 "kudu" 0x00007fc3260cdfb9 in ?? ()
3 LWP 9957 "kudu" 0x00007fc3260cdfb9 in ?? ()
4 LWP 9958 "kudu" 0x00007fc3260cdfb9 in ?? ()
5 LWP 9959 "kernel-watcher-" 0x00007fc3260cdfb9 in ?? ()
6 LWP 9965 "ntp client-9965" 0x00007fc3260d19e2 in ?? ()
7 LWP 9966 "file cache-evic" 0x00007fc3260cdfb9 in ?? ()
8 LWP 9967 "sq_acceptor" 0x00007fc324211cb9 in ?? ()
9 LWP 9970 "rpc reactor-997" 0x00007fc32421ea47 in ?? ()
10 LWP 9971 "rpc reactor-997" 0x00007fc32421ea47 in ?? ()
11 LWP 9972 "rpc reactor-997" 0x00007fc32421ea47 in ?? ()
12 LWP 9973 "rpc reactor-997" 0x00007fc32421ea47 in ?? ()
13 LWP 9974 "MaintenanceMgr " 0x00007fc3260cdad3 in ?? ()
14 LWP 9975 "txn-status-mana" 0x00007fc3260cdfb9 in ?? ()
15 LWP 9977 "collect_and_rem" 0x00007fc3260cdfb9 in ?? ()
16 LWP 9978 "tc-session-exp-" 0x00007fc3260cdfb9 in ?? ()
17 LWP 9979 "rpc worker-9979" 0x00007fc3260cdad3 in ?? ()
18 LWP 9980 "rpc worker-9980" 0x00007fc3260cdad3 in ?? ()
19 LWP 9981 "rpc worker-9981" 0x00007fc3260cdad3 in ?? ()
20 LWP 9982 "rpc worker-9982" 0x00007fc3260cdad3 in ?? ()
21 LWP 9983 "rpc worker-9983" 0x00007fc3260cdad3 in ?? ()
22 LWP 9984 "rpc worker-9984" 0x00007fc3260cdad3 in ?? ()
23 LWP 9985 "rpc worker-9985" 0x00007fc3260cdad3 in ?? ()
24 LWP 9986 "rpc worker-9986" 0x00007fc3260cdad3 in ?? ()
25 LWP 9987 "rpc worker-9987" 0x00007fc3260cdad3 in ?? ()
26 LWP 9988 "rpc worker-9988" 0x00007fc3260cdad3 in ?? ()
27 LWP 9989 "rpc worker-9989" 0x00007fc3260cdad3 in ?? ()
28 LWP 9990 "rpc worker-9990" 0x00007fc3260cdad3 in ?? ()
29 LWP 9991 "rpc worker-9991" 0x00007fc3260cdad3 in ?? ()
30 LWP 9992 "rpc worker-9992" 0x00007fc3260cdad3 in ?? ()
31 LWP 9993 "rpc worker-9993" 0x00007fc3260cdad3 in ?? ()
32 LWP 9994 "rpc worker-9994" 0x00007fc3260cdad3 in ?? ()
33 LWP 9995 "rpc worker-9995" 0x00007fc3260cdad3 in ?? ()
34 LWP 9996 "rpc worker-9996" 0x00007fc3260cdad3 in ?? ()
35 LWP 9997 "rpc worker-9997" 0x00007fc3260cdad3 in ?? ()
36 LWP 9998 "rpc worker-9998" 0x00007fc3260cdad3 in ?? ()
37 LWP 9999 "rpc worker-9999" 0x00007fc3260cdad3 in ?? ()
38 LWP 10000 "rpc worker-1000" 0x00007fc3260cdad3 in ?? ()
39 LWP 10001 "rpc worker-1000" 0x00007fc3260cdad3 in ?? ()
40 LWP 10002 "rpc worker-1000" 0x00007fc3260cdad3 in ?? ()
41 LWP 10003 "rpc worker-1000" 0x00007fc3260cdad3 in ?? ()
42 LWP 10004 "rpc worker-1000" 0x00007fc3260cdad3 in ?? ()
43 LWP 10005 "rpc worker-1000" 0x00007fc3260cdad3 in ?? ()
44 LWP 10006 "rpc worker-1000" 0x00007fc3260cdad3 in ?? ()
45 LWP 10007 "rpc worker-1000" 0x00007fc3260cdad3 in ?? ()
46 LWP 10008 "rpc worker-1000" 0x00007fc3260cdad3 in ?? ()
47 LWP 10009 "rpc worker-1000" 0x00007fc3260cdad3 in ?? ()
48 LWP 10010 "rpc worker-1001" 0x00007fc3260cdad3 in ?? ()
49 LWP 10011 "rpc worker-1001" 0x00007fc3260cdad3 in ?? ()
50 LWP 10012 "rpc worker-1001" 0x00007fc3260cdad3 in ?? ()
51 LWP 10013 "rpc worker-1001" 0x00007fc3260cdad3 in ?? ()
52 LWP 10014 "rpc worker-1001" 0x00007fc3260cdad3 in ?? ()
53 LWP 10015 "rpc worker-1001" 0x00007fc3260cdad3 in ?? ()
54 LWP 10016 "rpc worker-1001" 0x00007fc3260cdad3 in ?? ()
55 LWP 10017 "rpc worker-1001" 0x00007fc3260cdad3 in ?? ()
56 LWP 10018 "rpc worker-1001" 0x00007fc3260cdad3 in ?? ()
57 LWP 10019 "rpc worker-1001" 0x00007fc3260cdad3 in ?? ()
58 LWP 10020 "rpc worker-1002" 0x00007fc3260cdad3 in ?? ()
59 LWP 10021 "rpc worker-1002" 0x00007fc3260cdad3 in ?? ()
60 LWP 10022 "rpc worker-1002" 0x00007fc3260cdad3 in ?? ()
61 LWP 10023 "rpc worker-1002" 0x00007fc3260cdad3 in ?? ()
62 LWP 10024 "rpc worker-1002" 0x00007fc3260cdad3 in ?? ()
63 LWP 10025 "rpc worker-1002" 0x00007fc3260cdad3 in ?? ()
64 LWP 10026 "rpc worker-1002" 0x00007fc3260cdad3 in ?? ()
65 LWP 10027 "rpc worker-1002" 0x00007fc3260cdad3 in ?? ()
66 LWP 10028 "rpc worker-1002" 0x00007fc3260cdad3 in ?? ()
67 LWP 10029 "rpc worker-1002" 0x00007fc3260cdad3 in ?? ()
68 LWP 10030 "rpc worker-1003" 0x00007fc3260cdad3 in ?? ()
69 LWP 10031 "rpc worker-1003" 0x00007fc3260cdad3 in ?? ()
70 LWP 10032 "rpc worker-1003" 0x00007fc3260cdad3 in ?? ()
71 LWP 10033 "rpc worker-1003" 0x00007fc3260cdad3 in ?? ()
72 LWP 10034 "rpc worker-1003" 0x00007fc3260cdad3 in ?? ()
73 LWP 10035 "rpc worker-1003" 0x00007fc3260cdad3 in ?? ()
74 LWP 10036 "rpc worker-1003" 0x00007fc3260cdad3 in ?? ()
75 LWP 10037 "rpc worker-1003" 0x00007fc3260cdad3 in ?? ()
76 LWP 10038 "rpc worker-1003" 0x00007fc3260cdad3 in ?? ()
77 LWP 10039 "rpc worker-1003" 0x00007fc3260cdad3 in ?? ()
78 LWP 10040 "rpc worker-1004" 0x00007fc3260cdad3 in ?? ()
79 LWP 10041 "rpc worker-1004" 0x00007fc3260cdad3 in ?? ()
80 LWP 10042 "rpc worker-1004" 0x00007fc3260cdad3 in ?? ()
81 LWP 10043 "rpc worker-1004" 0x00007fc3260cdad3 in ?? ()
82 LWP 10044 "rpc worker-1004" 0x00007fc3260cdad3 in ?? ()
83 LWP 10045 "rpc worker-1004" 0x00007fc3260cdad3 in ?? ()
84 LWP 10046 "rpc worker-1004" 0x00007fc3260cdad3 in ?? ()
85 LWP 10047 "rpc worker-1004" 0x00007fc3260cdad3 in ?? ()
86 LWP 10048 "rpc worker-1004" 0x00007fc3260cdad3 in ?? ()
87 LWP 10049 "rpc worker-1004" 0x00007fc3260cdad3 in ?? ()
88 LWP 10050 "rpc worker-1005" 0x00007fc3260cdad3 in ?? ()
89 LWP 10051 "rpc worker-1005" 0x00007fc3260cdad3 in ?? ()
90 LWP 10052 "rpc worker-1005" 0x00007fc3260cdad3 in ?? ()
91 LWP 10053 "rpc worker-1005" 0x00007fc3260cdad3 in ?? ()
92 LWP 10054 "rpc worker-1005" 0x00007fc3260cdad3 in ?? ()
93 LWP 10055 "rpc worker-1005" 0x00007fc3260cdad3 in ?? ()
94 LWP 10056 "rpc worker-1005" 0x00007fc3260cdad3 in ?? ()
95 LWP 10057 "rpc worker-1005" 0x00007fc3260cdad3 in ?? ()
96 LWP 10058 "rpc worker-1005" 0x00007fc3260cdad3 in ?? ()
97 LWP 10059 "rpc worker-1005" 0x00007fc3260cdad3 in ?? ()
98 LWP 10060 "rpc worker-1006" 0x00007fc3260cdad3 in ?? ()
99 LWP 10061 "rpc worker-1006" 0x00007fc3260cdad3 in ?? ()
100 LWP 10062 "rpc worker-1006" 0x00007fc3260cdad3 in ?? ()
101 LWP 10063 "rpc worker-1006" 0x00007fc3260cdad3 in ?? ()
102 LWP 10064 "rpc worker-1006" 0x00007fc3260cdad3 in ?? ()
103 LWP 10065 "rpc worker-1006" 0x00007fc3260cdad3 in ?? ()
104 LWP 10066 "rpc worker-1006" 0x00007fc3260cdad3 in ?? ()
105 LWP 10067 "rpc worker-1006" 0x00007fc3260cdad3 in ?? ()
106 LWP 10068 "rpc worker-1006" 0x00007fc3260cdad3 in ?? ()
107 LWP 10069 "rpc worker-1006" 0x00007fc3260cdad3 in ?? ()
108 LWP 10070 "rpc worker-1007" 0x00007fc3260cdad3 in ?? ()
109 LWP 10071 "rpc worker-1007" 0x00007fc3260cdad3 in ?? ()
110 LWP 10072 "rpc worker-1007" 0x00007fc3260cdad3 in ?? ()
111 LWP 10073 "rpc worker-1007" 0x00007fc3260cdad3 in ?? ()
112 LWP 10074 "rpc worker-1007" 0x00007fc3260cdad3 in ?? ()
113 LWP 10075 "rpc worker-1007" 0x00007fc3260cdad3 in ?? ()
114 LWP 10076 "rpc worker-1007" 0x00007fc3260cdad3 in ?? ()
115 LWP 10077 "rpc worker-1007" 0x00007fc3260cdad3 in ?? ()
116 LWP 10078 "rpc worker-1007" 0x00007fc3260cdad3 in ?? ()
117 LWP 10079 "diag-logger-100" 0x00007fc3260cdfb9 in ?? ()
118 LWP 10080 "result-tracker-" 0x00007fc3260cdfb9 in ?? ()
119 LWP 10081 "excess-log-dele" 0x00007fc3260cdfb9 in ?? ()
120 LWP 10082 "tcmalloc-memory" 0x00007fc3260cdfb9 in ?? ()
121 LWP 10083 "acceptor-10083" 0x00007fc3242200c7 in ?? ()
122 LWP 10084 "heartbeat-10084" 0x00007fc3260cdfb9 in ?? ()
123 LWP 10085 "maintenance_sch" 0x00007fc3260cdfb9 in ?? ()
Thread 123 (LWP 10085):
#0 0x00007fc3260cdfb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000023 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000055ae0ab0fe50 in ?? ()
#5 0x00007fc2dc9a6470 in ?? ()
#6 0x0000000000000046 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 122 (LWP 10084):
#0 0x00007fc3260cdfb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000009 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000055ae0aa89930 in ?? ()
#5 0x00007fc2dd1a73f0 in ?? ()
#6 0x0000000000000012 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 121 (LWP 10083):
#0 0x00007fc3242200c7 in ?? ()
#1 0x00007fc2dd9a8010 in ?? ()
#2 0x00007fc325d802f2 in ?? ()
#3 0x00007fc2dd9a8010 in ?? ()
#4 0x0000000000080800 in ?? ()
#5 0x00007fc2dd9a83d0 in ?? ()
#6 0x00007fc2dd9a8080 in ?? ()
#7 0x000055ae0aa34c48 in ?? ()
#8 0x00007fc325d85d89 in ?? ()
#9 0x00007fc2dd9a8500 in ?? ()
#10 0x00007fc2dd9a8700 in ?? ()
#11 0x00000080dd9a83c0 in ?? ()
#12 0x00007fc3275ea8ca in ?? () from /lib64/ld-linux-x86-64.so.2
Backtrace stopped: previous frame inner to this frame (corrupt stack?)
Thread 120 (LWP 10082):
#0 0x00007fc3260cdfb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000001 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x00007ffc622f39b0 in ?? ()
#5 0x00007fc2de1a9670 in ?? ()
#6 0x0000000000000002 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 119 (LWP 10081):
#0 0x00007fc3260cdfb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 118 (LWP 10080):
#0 0x00007fc3260cdfb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000008 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000055ae0a9bdb70 in ?? ()
#5 0x00007fc2df1ab680 in ?? ()
#6 0x0000000000000010 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 117 (LWP 10079):
#0 0x00007fc3260cdfb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000008 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000055ae0ad2e790 in ?? ()
#5 0x00007fc2df9ac550 in ?? ()
#6 0x0000000000000010 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 116 (LWP 10078):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x000055ae0ae3b23c in ?? ()
#4 0x00007fc2e01ad5d0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fc2e01ad5f0 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x000055ae0ae3b228 in ?? ()
#9 0x00007fc3260cd770 in ?? ()
#10 0x00007fc2e01ad5f0 in ?? ()
#11 0x00007fc2e01ad650 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 115 (LWP 10077):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000007 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x000055ae0ae3b2bc in ?? ()
#4 0x00007fc2e09ae5d0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fc2e09ae5f0 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x000055ae0ae3b2a8 in ?? ()
#9 0x00007fc3260cd770 in ?? ()
#10 0x00007fc2e09ae5f0 in ?? ()
#11 0x00007fc2e09ae650 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 114 (LWP 10076):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 113 (LWP 10075):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 112 (LWP 10074):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 111 (LWP 10073):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 110 (LWP 10072):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 109 (LWP 10071):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 108 (LWP 10070):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 107 (LWP 10069):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 106 (LWP 10068):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 105 (LWP 10067):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 104 (LWP 10066):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 103 (LWP 10065):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 102 (LWP 10064):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 101 (LWP 10063):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 100 (LWP 10062):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 99 (LWP 10061):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 98 (LWP 10060):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 97 (LWP 10059):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 96 (LWP 10058):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 95 (LWP 10057):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 94 (LWP 10056):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 93 (LWP 10055):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 92 (LWP 10054):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 91 (LWP 10053):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 90 (LWP 10052):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 89 (LWP 10051):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 88 (LWP 10050):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 87 (LWP 10049):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 86 (LWP 10048):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 85 (LWP 10047):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 84 (LWP 10046):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 83 (LWP 10045):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 82 (LWP 10044):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 81 (LWP 10043):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 80 (LWP 10042):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 79 (LWP 10041):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 78 (LWP 10040):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 77 (LWP 10039):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 76 (LWP 10038):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 75 (LWP 10037):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 74 (LWP 10036):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 73 (LWP 10035):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 72 (LWP 10034):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 71 (LWP 10033):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 70 (LWP 10032):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x00000000000001d0 in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x000055ae0ae3adb8 in ?? ()
#4 0x00007fc2f71db5d0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fc2f71db5f0 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 69 (LWP 10031):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x000000000000010f in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x000055ae0ae3ae3c in ?? ()
#4 0x00007fc2f79dc5d0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fc2f79dc5f0 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x000055ae0ae3ae28 in ?? ()
#9 0x00007fc3260cd770 in ?? ()
#10 0x00007fc2f79dc5f0 in ?? ()
#11 0x00007fc2f79dc650 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 68 (LWP 10030):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 67 (LWP 10029):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 66 (LWP 10028):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 65 (LWP 10027):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 64 (LWP 10026):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 63 (LWP 10025):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 62 (LWP 10024):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 61 (LWP 10023):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 60 (LWP 10022):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 59 (LWP 10021):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 58 (LWP 10020):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 57 (LWP 10019):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 56 (LWP 10018):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000002 in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x000055ae0ae37238 in ?? ()
#4 0x00007fc2fe1e95d0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fc2fe1e95f0 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 55 (LWP 10017):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 54 (LWP 10016):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 53 (LWP 10015):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 52 (LWP 10014):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 51 (LWP 10013):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 50 (LWP 10012):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 49 (LWP 10011):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 48 (LWP 10010):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 47 (LWP 10009):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 46 (LWP 10008):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 45 (LWP 10007):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 44 (LWP 10006):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 43 (LWP 10005):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 42 (LWP 10004):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 41 (LWP 10003):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 40 (LWP 10002):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 39 (LWP 10001):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 38 (LWP 10000):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 37 (LWP 9999):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 36 (LWP 9998):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x000055ae0ae3683c in ?? ()
#4 0x00007fc3081fd5d0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fc3081fd5f0 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x000055ae0ae36828 in ?? ()
#9 0x00007fc3260cd770 in ?? ()
#10 0x00007fc3081fd5f0 in ?? ()
#11 0x00007fc3081fd650 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 35 (LWP 9997):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 34 (LWP 9996):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 33 (LWP 9995):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x000055ae0ae3673c in ?? ()
#4 0x00007fc309a005d0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fc309a005f0 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x000055ae0ae36728 in ?? ()
#9 0x00007fc3260cd770 in ?? ()
#10 0x00007fc309a005f0 in ?? ()
#11 0x00007fc309a00650 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 32 (LWP 9994):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000065 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x000055ae0ae366bc in ?? ()
#4 0x00007fc30a2015d0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fc30a2015f0 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x000055ae0ae366a8 in ?? ()
#9 0x00007fc3260cd770 in ?? ()
#10 0x00007fc30a2015f0 in ?? ()
#11 0x00007fc30a201650 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 31 (LWP 9993):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 30 (LWP 9992):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 29 (LWP 9991):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 28 (LWP 9990):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 27 (LWP 9989):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 26 (LWP 9988):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 25 (LWP 9987):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 24 (LWP 9986):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 23 (LWP 9985):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 22 (LWP 9984):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 21 (LWP 9983):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 20 (LWP 9982):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 19 (LWP 9981):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 18 (LWP 9980):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 17 (LWP 9979):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 16 (LWP 9978):
#0 0x00007fc3260cdfb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 15 (LWP 9977):
#0 0x00007fc3260cdfb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000001 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000055ae0a9a36c8 in ?? ()
#5 0x00007fc312a126a0 in ?? ()
#6 0x0000000000000002 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 14 (LWP 9975):
#0 0x00007fc3260cdfb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 13 (LWP 9974):
#0 0x00007fc3260cdad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 12 (LWP 9973):
#0 0x00007fc32421ea47 in ?? ()
#1 0x00007fc314215680 in ?? ()
#2 0x00007fc31f52a571 in ?? ()
#3 0x00000000000000cb in ?? ()
#4 0x000055ae0aaa0e58 in ?? ()
#5 0x00007fc3142156c0 in ?? ()
#6 0x00007fc314215840 in ?? ()
#7 0x000055ae0ab7b030 in ?? ()
#8 0x00007fc31f52c25d in ?? ()
#9 0x3fb95bea4379e000 in ?? ()
#10 0x000055ae0aa8d180 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055ae0aa8d180 in ?? ()
#13 0x000000000aaa0e58 in ?? ()
#14 0x000055ae00000000 in ?? ()
#15 0x41da2dd847e1f7cb in ?? ()
#16 0x000055ae0ab7b030 in ?? ()
#17 0x00007fc314215720 in ?? ()
#18 0x00007fc31f530ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb95bea4379e000 in ?? ()
#21 0x0000000000000000 in ?? ()
Thread 11 (LWP 9972):
#0 0x00007fc32421ea47 in ?? ()
#1 0x00007fc314a16680 in ?? ()
#2 0x00007fc31f52a571 in ?? ()
#3 0x00000000000000cb in ?? ()
#4 0x000055ae0aaa1a98 in ?? ()
#5 0x00007fc314a166c0 in ?? ()
#6 0x00007fc314a16840 in ?? ()
#7 0x000055ae0ab7b030 in ?? ()
#8 0x00007fc31f52c25d in ?? ()
#9 0x3fb965c6767c5000 in ?? ()
#10 0x000055ae0aa8cc00 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055ae0aa8cc00 in ?? ()
#13 0x000000000aaa1a98 in ?? ()
#14 0x000055ae00000000 in ?? ()
#15 0x41da2dd847e1f7cc in ?? ()
#16 0x000055ae0ab7b030 in ?? ()
#17 0x00007fc314a16720 in ?? ()
#18 0x00007fc31f530ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb965c6767c5000 in ?? ()
#21 0x0000000000000000 in ?? ()
Thread 10 (LWP 9971):
#0 0x00007fc32421ea47 in ?? ()
#1 0x00007fc315217680 in ?? ()
#2 0x00007fc31f52a571 in ?? ()
#3 0x00000000000000cb in ?? ()
#4 0x000055ae0aaa1c58 in ?? ()
#5 0x00007fc3152176c0 in ?? ()
#6 0x00007fc315217840 in ?? ()
#7 0x000055ae0ab7b030 in ?? ()
#8 0x00007fc31f52c25d in ?? ()
#9 0x3fb955540d7e4000 in ?? ()
#10 0x000055ae0aa8c680 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055ae0aa8c680 in ?? ()
#13 0x000000000aaa1c58 in ?? ()
#14 0x000055ae00000000 in ?? ()
#15 0x41da2dd847e1f7c9 in ?? ()
#16 0x000055ae0ab7b030 in ?? ()
#17 0x00007fc315217720 in ?? ()
#18 0x00007fc31f530ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb955540d7e4000 in ?? ()
#21 0x0000000000000000 in ?? ()
Thread 9 (LWP 9970):
#0 0x00007fc32421ea47 in ?? ()
#1 0x00007fc315a18680 in ?? ()
#2 0x00007fc31f52a571 in ?? ()
#3 0x00000000000000cb in ?? ()
#4 0x000055ae0aaa1e18 in ?? ()
#5 0x00007fc315a186c0 in ?? ()
#6 0x00007fc315a18840 in ?? ()
#7 0x000055ae0ab7b030 in ?? ()
#8 0x00007fc31f52c25d in ?? ()
#9 0x3fb9771f41f81000 in ?? ()
#10 0x000055ae0aa8b600 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055ae0aa8b600 in ?? ()
#13 0x000000000aaa1e18 in ?? ()
#14 0x000055ae00000000 in ?? ()
#15 0x41da2dd847e1f7cc in ?? ()
#16 0x000055ae0ab7b030 in ?? ()
#17 0x00007fc315a18720 in ?? ()
#18 0x00007fc31f530ba3 in ?? ()
#19 0x0000000000000000 in ?? ()
Thread 8 (LWP 9967):
#0 0x00007fc324211cb9 in ?? ()
#1 0x00007fc31900b9c0 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 7 (LWP 9966):
#0 0x00007fc3260cdfb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 6 (LWP 9965):
#0 0x00007fc3260d19e2 in ?? ()
#1 0x000055ae0a9bdee0 in ?? ()
#2 0x00007fc3180094d0 in ?? ()
#3 0x00007fc318009450 in ?? ()
#4 0x00007fc318009570 in ?? ()
#5 0x00007fc318009790 in ?? ()
#6 0x00007fc3180097a0 in ?? ()
#7 0x00007fc3180094e0 in ?? ()
#8 0x00007fc3180094d0 in ?? ()
#9 0x000055ae0a9bc350 in ?? ()
#10 0x00007fc3264bcc6f in ?? ()
#11 0x3ff0000000000000 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 5 (LWP 9959):
#0 0x00007fc3260cdfb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x000000000000002c in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000055ae0ab42dc8 in ?? ()
#5 0x00007fc31a00d430 in ?? ()
#6 0x0000000000000058 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 4 (LWP 9958):
#0 0x00007fc3260cdfb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000002 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000055ae0a9a2848 in ?? ()
#5 0x00007fc31a80e790 in ?? ()
#6 0x0000000000000004 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 3 (LWP 9957):
#0 0x00007fc3260cdfb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000002 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000055ae0a9a22a8 in ?? ()
#5 0x00007fc31b00f790 in ?? ()
#6 0x0000000000000004 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 2 (LWP 9956):
#0 0x00007fc3260cdfb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000002 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000055ae0a9a2188 in ?? ()
#5 0x00007fc31b810790 in ?? ()
#6 0x0000000000000004 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 1 (LWP 9953):
#0 0x00007fc3260d1d50 in ?? ()
#1 0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20250902 21:33:04.447832 4307 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 3 with UUID 3a35f7f28cb9438dbcfb3196e167fdc5 and pid 9820
************************ BEGIN STACKS **************************
[New LWP 9821] ... [New LWP 9950]  (122 LWPs; each is listed with its thread name in the table below)
0x00007f8b19c7ad50 in ?? ()
Id Target Id Frame
* 1 LWP 9820 "kudu" 0x00007f8b19c7ad50 in ?? ()
2 LWP 9821 "kudu" 0x00007f8b19c76fb9 in ?? ()
3 LWP 9822 "kudu" 0x00007f8b19c76fb9 in ?? ()
4 LWP 9823 "kudu" 0x00007f8b19c76fb9 in ?? ()
5 LWP 9824 "kernel-watcher-" 0x00007f8b19c76fb9 in ?? ()
6 LWP 9830 "ntp client-9830" 0x00007f8b19c7a9e2 in ?? ()
7 LWP 9831 "file cache-evic" 0x00007f8b19c76fb9 in ?? ()
8 LWP 9832 "sq_acceptor" 0x00007f8b17dbacb9 in ?? ()
9 LWP 9835 "rpc reactor-983" 0x00007f8b17dc7a47 in ?? ()
10 LWP 9836 "rpc reactor-983" 0x00007f8b17dc7a47 in ?? ()
11 LWP 9837 "rpc reactor-983" 0x00007f8b17dc7a47 in ?? ()
12 LWP 9838 "rpc reactor-983" 0x00007f8b17dc7a47 in ?? ()
13 LWP 9839 "MaintenanceMgr " 0x00007f8b19c76ad3 in ?? ()
14 LWP 9840 "txn-status-mana" 0x00007f8b19c76fb9 in ?? ()
15 LWP 9842 "collect_and_rem" 0x00007f8b19c76fb9 in ?? ()
16 LWP 9843 "tc-session-exp-" 0x00007f8b19c76fb9 in ?? ()
17 LWP 9844 "rpc worker-9844" 0x00007f8b19c76ad3 in ?? ()
18 LWP 9845 "rpc worker-9845" 0x00007f8b19c76ad3 in ?? ()
19 LWP 9846 "rpc worker-9846" 0x00007f8b19c76ad3 in ?? ()
20 LWP 9847 "rpc worker-9847" 0x00007f8b19c76ad3 in ?? ()
21 LWP 9848 "rpc worker-9848" 0x00007f8b19c76ad3 in ?? ()
22 LWP 9849 "rpc worker-9849" 0x00007f8b19c76ad3 in ?? ()
23 LWP 9850 "rpc worker-9850" 0x00007f8b19c76ad3 in ?? ()
24 LWP 9851 "rpc worker-9851" 0x00007f8b19c76ad3 in ?? ()
25 LWP 9852 "rpc worker-9852" 0x00007f8b19c76ad3 in ?? ()
26 LWP 9853 "rpc worker-9853" 0x00007f8b19c76ad3 in ?? ()
27 LWP 9854 "rpc worker-9854" 0x00007f8b19c76ad3 in ?? ()
28 LWP 9855 "rpc worker-9855" 0x00007f8b19c76ad3 in ?? ()
29 LWP 9856 "rpc worker-9856" 0x00007f8b19c76ad3 in ?? ()
30 LWP 9857 "rpc worker-9857" 0x00007f8b19c76ad3 in ?? ()
31 LWP 9858 "rpc worker-9858" 0x00007f8b19c76ad3 in ?? ()
32 LWP 9859 "rpc worker-9859" 0x00007f8b19c76ad3 in ?? ()
33 LWP 9860 "rpc worker-9860" 0x00007f8b19c76ad3 in ?? ()
34 LWP 9861 "rpc worker-9861" 0x00007f8b19c76ad3 in ?? ()
35 LWP 9862 "rpc worker-9862" 0x00007f8b19c76ad3 in ?? ()
36 LWP 9863 "rpc worker-9863" 0x00007f8b19c76ad3 in ?? ()
37 LWP 9864 "rpc worker-9864" 0x00007f8b19c76ad3 in ?? ()
38 LWP 9865 "rpc worker-9865" 0x00007f8b19c76ad3 in ?? ()
39 LWP 9866 "rpc worker-9866" 0x00007f8b19c76ad3 in ?? ()
40 LWP 9867 "rpc worker-9867" 0x00007f8b19c76ad3 in ?? ()
41 LWP 9868 "rpc worker-9868" 0x00007f8b19c76ad3 in ?? ()
42 LWP 9869 "rpc worker-9869" 0x00007f8b19c76ad3 in ?? ()
43 LWP 9870 "rpc worker-9870" 0x00007f8b19c76ad3 in ?? ()
44 LWP 9871 "rpc worker-9871" 0x00007f8b19c76ad3 in ?? ()
45 LWP 9872 "rpc worker-9872" 0x00007f8b19c76ad3 in ?? ()
46 LWP 9873 "rpc worker-9873" 0x00007f8b19c76ad3 in ?? ()
47 LWP 9874 "rpc worker-9874" 0x00007f8b19c76ad3 in ?? ()
48 LWP 9875 "rpc worker-9875" 0x00007f8b19c76ad3 in ?? ()
49 LWP 9876 "rpc worker-9876" 0x00007f8b19c76ad3 in ?? ()
50 LWP 9877 "rpc worker-9877" 0x00007f8b19c76ad3 in ?? ()
51 LWP 9878 "rpc worker-9878" 0x00007f8b19c76ad3 in ?? ()
52 LWP 9879 "rpc worker-9879" 0x00007f8b19c76ad3 in ?? ()
53 LWP 9880 "rpc worker-9880" 0x00007f8b19c76ad3 in ?? ()
54 LWP 9881 "rpc worker-9881" 0x00007f8b19c76ad3 in ?? ()
55 LWP 9882 "rpc worker-9882" 0x00007f8b19c76ad3 in ?? ()
56 LWP 9883 "rpc worker-9883" 0x00007f8b19c76ad3 in ?? ()
57 LWP 9884 "rpc worker-9884" 0x00007f8b19c76ad3 in ?? ()
58 LWP 9885 "rpc worker-9885" 0x00007f8b19c76ad3 in ?? ()
59 LWP 9886 "rpc worker-9886" 0x00007f8b19c76ad3 in ?? ()
60 LWP 9887 "rpc worker-9887" 0x00007f8b19c76ad3 in ?? ()
61 LWP 9888 "rpc worker-9888" 0x00007f8b19c76ad3 in ?? ()
62 LWP 9889 "rpc worker-9889" 0x00007f8b19c76ad3 in ?? ()
63 LWP 9890 "rpc worker-9890" 0x00007f8b19c76ad3 in ?? ()
64 LWP 9891 "rpc worker-9891" 0x00007f8b19c76ad3 in ?? ()
65 LWP 9892 "rpc worker-9892" 0x00007f8b19c76ad3 in ?? ()
66 LWP 9893 "rpc worker-9893" 0x00007f8b19c76ad3 in ?? ()
67 LWP 9894 "rpc worker-9894" 0x00007f8b19c76ad3 in ?? ()
68 LWP 9895 "rpc worker-9895" 0x00007f8b19c76ad3 in ?? ()
69 LWP 9896 "rpc worker-9896" 0x00007f8b19c76ad3 in ?? ()
70 LWP 9897 "rpc worker-9897" 0x00007f8b19c76ad3 in ?? ()
71 LWP 9898 "rpc worker-9898" 0x00007f8b19c76ad3 in ?? ()
72 LWP 9899 "rpc worker-9899" 0x00007f8b19c76ad3 in ?? ()
73 LWP 9900 "rpc worker-9900" 0x00007f8b19c76ad3 in ?? ()
74 LWP 9901 "rpc worker-9901" 0x00007f8b19c76ad3 in ?? ()
75 LWP 9902 "rpc worker-9902" 0x00007f8b19c76ad3 in ?? ()
76 LWP 9903 "rpc worker-9903" 0x00007f8b19c76ad3 in ?? ()
77 LWP 9904 "rpc worker-9904" 0x00007f8b19c76ad3 in ?? ()
78 LWP 9905 "rpc worker-9905" 0x00007f8b19c76ad3 in ?? ()
79 LWP 9906 "rpc worker-9906" 0x00007f8b19c76ad3 in ?? ()
80 LWP 9907 "rpc worker-9907" 0x00007f8b19c76ad3 in ?? ()
81 LWP 9908 "rpc worker-9908" 0x00007f8b19c76ad3 in ?? ()
82 LWP 9909 "rpc worker-9909" 0x00007f8b19c76ad3 in ?? ()
83 LWP 9910 "rpc worker-9910" 0x00007f8b19c76ad3 in ?? ()
84 LWP 9911 "rpc worker-9911" 0x00007f8b19c76ad3 in ?? ()
85 LWP 9912 "rpc worker-9912" 0x00007f8b19c76ad3 in ?? ()
86 LWP 9913 "rpc worker-9913" 0x00007f8b19c76ad3 in ?? ()
87 LWP 9914 "rpc worker-9914" 0x00007f8b19c76ad3 in ?? ()
88 LWP 9915 "rpc worker-9915" 0x00007f8b19c76ad3 in ?? ()
89 LWP 9916 "rpc worker-9916" 0x00007f8b19c76ad3 in ?? ()
90 LWP 9917 "rpc worker-9917" 0x00007f8b19c76ad3 in ?? ()
91 LWP 9918 "rpc worker-9918" 0x00007f8b19c76ad3 in ?? ()
92 LWP 9919 "rpc worker-9919" 0x00007f8b19c76ad3 in ?? ()
93 LWP 9920 "rpc worker-9920" 0x00007f8b19c76ad3 in ?? ()
94 LWP 9921 "rpc worker-9921" 0x00007f8b19c76ad3 in ?? ()
95 LWP 9922 "rpc worker-9922" 0x00007f8b19c76ad3 in ?? ()
96 LWP 9923 "rpc worker-9923" 0x00007f8b19c76ad3 in ?? ()
97 LWP 9924 "rpc worker-9924" 0x00007f8b19c76ad3 in ?? ()
98 LWP 9925 "rpc worker-9925" 0x00007f8b19c76ad3 in ?? ()
99 LWP 9926 "rpc worker-9926" 0x00007f8b19c76ad3 in ?? ()
100 LWP 9927 "rpc worker-9927" 0x00007f8b19c76ad3 in ?? ()
101 LWP 9928 "rpc worker-9928" 0x00007f8b19c76ad3 in ?? ()
102 LWP 9929 "rpc worker-9929" 0x00007f8b19c76ad3 in ?? ()
103 LWP 9930 "rpc worker-9930" 0x00007f8b19c76ad3 in ?? ()
104 LWP 9931 "rpc worker-9931" 0x00007f8b19c76ad3 in ?? ()
105 LWP 9932 "rpc worker-9932" 0x00007f8b19c76ad3 in ?? ()
106 LWP 9933 "rpc worker-9933" 0x00007f8b19c76ad3 in ?? ()
107 LWP 9934 "rpc worker-9934" 0x00007f8b19c76ad3 in ?? ()
108 LWP 9935 "rpc worker-9935" 0x00007f8b19c76ad3 in ?? ()
109 LWP 9936 "rpc worker-9936" 0x00007f8b19c76ad3 in ?? ()
110 LWP 9937 "rpc worker-9937" 0x00007f8b19c76ad3 in ?? ()
111 LWP 9938 "rpc worker-9938" 0x00007f8b19c76ad3 in ?? ()
112 LWP 9939 "rpc worker-9939" 0x00007f8b19c76ad3 in ?? ()
113 LWP 9940 "rpc worker-9940" 0x00007f8b19c76ad3 in ?? ()
114 LWP 9941 "rpc worker-9941" 0x00007f8b19c76ad3 in ?? ()
115 LWP 9942 "rpc worker-9942" 0x00007f8b19c76ad3 in ?? ()
116 LWP 9943 "rpc worker-9943" 0x00007f8b19c76ad3 in ?? ()
117 LWP 9944 "diag-logger-994" 0x00007f8b19c76fb9 in ?? ()
118 LWP 9945 "result-tracker-" 0x00007f8b19c76fb9 in ?? ()
119 LWP 9946 "excess-log-dele" 0x00007f8b19c76fb9 in ?? ()
120 LWP 9947 "tcmalloc-memory" 0x00007f8b19c76fb9 in ?? ()
121 LWP 9948 "acceptor-9948" 0x00007f8b17dc90c7 in ?? ()
122 LWP 9949 "heartbeat-9949" 0x00007f8b19c76fb9 in ?? ()
123 LWP 9950 "maintenance_sch" 0x00007f8b19c76fb9 in ?? ()
Thread 123 (LWP 9950):
#0 0x00007f8b19c76fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000026 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000056145d385e50 in ?? ()
#5 0x00007f8ad054f470 in ?? ()
#6 0x000000000000004c in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 122 (LWP 9949):
#0 0x00007f8b19c76fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x000000000000000b in ?? ()
#3 0x0000000100000081 in ?? ()
#4 0x000056145d2ff934 in ?? ()
#5 0x00007f8ad0d503f0 in ?? ()
#6 0x0000000000000017 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x00007f8ad0d50410 in ?? ()
#9 0x0000000200000000 in ?? ()
#10 0x0000008000000189 in ?? ()
#11 0x00007f8ad0d50470 in ?? ()
#12 0x00007f8b19918a11 in ?? ()
#13 0x0000000000000000 in ?? ()
Thread 121 (LWP 9948):
#0 0x00007f8b17dc90c7 in ?? ()
#1 0x00007f8ad1551010 in ?? ()
#2 0x00007f8b199292f2 in ?? ()
#3 0x00007f8ad1551010 in ?? ()
#4 0x0000000000080800 in ?? ()
#5 0x00007f8ad15513d0 in ?? ()
#6 0x00007f8ad1551080 in ?? ()
#7 0x000056145d2aac48 in ?? ()
#8 0x00007f8b1992ed89 in ?? ()
#9 0x00007f8ad1551500 in ?? ()
#10 0x00007f8ad1551700 in ?? ()
#11 0x00000080d15513c0 in ?? ()
#12 0x00007f8b1b1938ca in ?? () from /lib64/ld-linux-x86-64.so.2
Backtrace stopped: previous frame inner to this frame (corrupt stack?)
Thread 120 (LWP 9947):
#0 0x00007f8b19c76fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000001 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x00007ffce514a620 in ?? ()
#5 0x00007f8ad1d52670 in ?? ()
#6 0x0000000000000002 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 119 (LWP 9946):
#0 0x00007f8b19c76fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 118 (LWP 9945):
#0 0x00007f8b19c76fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000009 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000056145d233b70 in ?? ()
#5 0x00007f8ad2d54680 in ?? ()
#6 0x0000000000000012 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 117 (LWP 9944):
#0 0x00007f8b19c76fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000009 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000056145d5b5390 in ?? ()
#5 0x00007f8ad3555550 in ?? ()
#6 0x0000000000000012 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 116 (LWP 9943):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 115 (LWP 9942):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 114 (LWP 9941):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 113 (LWP 9940):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 112 (LWP 9939):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 111 (LWP 9938):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 110 (LWP 9937):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 109 (LWP 9936):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 108 (LWP 9935):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 107 (LWP 9934):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 106 (LWP 9933):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 105 (LWP 9932):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 104 (LWP 9931):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 103 (LWP 9930):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 102 (LWP 9929):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 101 (LWP 9928):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 100 (LWP 9927):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 99 (LWP 9926):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 98 (LWP 9925):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000008 in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x000056145d589738 in ?? ()
#4 0x00007f8adcd685d0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f8adcd685f0 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 97 (LWP 9924):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 96 (LWP 9923):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 95 (LWP 9922):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 94 (LWP 9921):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 93 (LWP 9920):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 92 (LWP 9919):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 91 (LWP 9918):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 90 (LWP 9917):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 89 (LWP 9916):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 88 (LWP 9915):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 87 (LWP 9914):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 86 (LWP 9913):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 85 (LWP 9912):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 84 (LWP 9911):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 83 (LWP 9910):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 82 (LWP 9909):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 81 (LWP 9908):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 80 (LWP 9907):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 79 (LWP 9906):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 78 (LWP 9905):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 77 (LWP 9904):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 76 (LWP 9903):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x00000000000003aa in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x000056145d573bb8 in ?? ()
#4 0x00007f8ae7d7e5d0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f8ae7d7e5f0 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 75 (LWP 9902):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x000000000000023d in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x000056145d573b3c in ?? ()
#4 0x00007f8ae857f5d0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f8ae857f5f0 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x000056145d573b28 in ?? ()
#9 0x00007f8b19c76770 in ?? ()
#10 0x00007f8ae857f5f0 in ?? ()
#11 0x00007f8ae857f650 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 74 (LWP 9901):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 73 (LWP 9900):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 72 (LWP 9899):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 71 (LWP 9898):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 70 (LWP 9897):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 69 (LWP 9896):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 68 (LWP 9895):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 67 (LWP 9894):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 66 (LWP 9893):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 65 (LWP 9892):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 64 (LWP 9891):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 63 (LWP 9890):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 62 (LWP 9889):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 61 (LWP 9888):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 60 (LWP 9887):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 59 (LWP 9886):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 58 (LWP 9885):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 57 (LWP 9884):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 56 (LWP 9883):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000002 in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x000056145d5730b8 in ?? ()
#4 0x00007f8af1d925d0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f8af1d925f0 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 55 (LWP 9882):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 54 (LWP 9881):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 53 (LWP 9880):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 52 (LWP 9879):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 51 (LWP 9878):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 50 (LWP 9877):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 49 (LWP 9876):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 48 (LWP 9875):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 47 (LWP 9874):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 46 (LWP 9873):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 45 (LWP 9872):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 44 (LWP 9871):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 43 (LWP 9870):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 42 (LWP 9869):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 41 (LWP 9868):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 40 (LWP 9867):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 39 (LWP 9866):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 38 (LWP 9865):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 37 (LWP 9864):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 36 (LWP 9863):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 35 (LWP 9862):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 34 (LWP 9861):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 33 (LWP 9860):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 32 (LWP 9859):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 31 (LWP 9858):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000013 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x000056145d5890bc in ?? ()
#4 0x00007f8afe5ab5d0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f8afe5ab5f0 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x000056145d5890a8 in ?? ()
#9 0x00007f8b19c76770 in ?? ()
#10 0x00007f8afe5ab5f0 in ?? ()
#11 0x00007f8afe5ab650 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 30 (LWP 9857):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 29 (LWP 9856):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 28 (LWP 9855):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 27 (LWP 9854):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x000000000000005e in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x000056145d589138 in ?? ()
#4 0x00007f8b005af5d0 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f8b005af5f0 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 26 (LWP 9853):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 25 (LWP 9852):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 24 (LWP 9851):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 23 (LWP 9850):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 22 (LWP 9849):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 21 (LWP 9848):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 20 (LWP 9847):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 19 (LWP 9846):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 18 (LWP 9845):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 17 (LWP 9844):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 16 (LWP 9843):
#0 0x00007f8b19c76fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 15 (LWP 9842):
#0 0x00007f8b19c76fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000001 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000056145d2196c8 in ?? ()
#5 0x00007f8b065bb6a0 in ?? ()
#6 0x0000000000000002 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 14 (LWP 9840):
#0 0x00007f8b19c76fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 13 (LWP 9839):
#0 0x00007f8b19c76ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 12 (LWP 9838):
#0 0x00007f8b17dc7a47 in ?? ()
#1 0x00007f8b085bf680 in ?? ()
#2 0x00007f8b130d3571 in ?? ()
#3 0x00000000000000cb in ?? ()
#4 0x000056145d316e58 in ?? ()
#5 0x00007f8b085bf6c0 in ?? ()
#6 0x00007f8b085bf840 in ?? ()
#7 0x000056145d3f1030 in ?? ()
#8 0x00007f8b130d525d in ?? ()
#9 0x3fb979fca82b8000 in ?? ()
#10 0x000056145d303180 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000056145d303180 in ?? ()
#13 0x000000005d316e58 in ?? ()
#14 0x0000561400000000 in ?? ()
#15 0x41da2dd847e1f7ca in ?? ()
#16 0x000056145d3f1030 in ?? ()
#17 0x00007f8b085bf720 in ?? ()
#18 0x00007f8b130d9ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb979fca82b8000 in ?? ()
#21 0x0000000000000000 in ?? ()
Thread 11 (LWP 9837):
#0 0x00007f8b17dc7a47 in ?? ()
#1 0x00007f8b08dc0680 in ?? ()
#2 0x00007f8b130d3571 in ?? ()
#3 0x00000000000000cb in ?? ()
#4 0x000056145d317a98 in ?? ()
#5 0x00007f8b08dc06c0 in ?? ()
#6 0x00007f8b08dc0840 in ?? ()
#7 0x000056145d3f1030 in ?? ()
#8 0x00007f8b130d525d in ?? ()
#9 0x3fb9781a2934a000 in ?? ()
#10 0x000056145d302c00 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000056145d302c00 in ?? ()
#13 0x000000005d317a98 in ?? ()
#14 0x0000561400000000 in ?? ()
#15 0x41da2dd847e1f7cc in ?? ()
#16 0x000056145d3f1030 in ?? ()
#17 0x00007f8b08dc0720 in ?? ()
#18 0x00007f8b130d9ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb9781a2934a000 in ?? ()
#21 0x0000000000000000 in ?? ()
Thread 10 (LWP 9836):
#0 0x00007f8b17dc7a47 in ?? ()
#1 0x00007f8b095c1680 in ?? ()
#2 0x00007f8b130d3571 in ?? ()
#3 0x00000000000000cb in ?? ()
#4 0x000056145d317c58 in ?? ()
#5 0x00007f8b095c16c0 in ?? ()
#6 0x00007f8b095c1840 in ?? ()
#7 0x000056145d3f1030 in ?? ()
#8 0x00007f8b130d525d in ?? ()
#9 0x3fb5874e28038000 in ?? ()
#10 0x000056145d302100 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000056145d302100 in ?? ()
#13 0x000000005d317c58 in ?? ()
#14 0x0000561400000000 in ?? ()
#15 0x41da2dd847e1f7cc in ?? ()
#16 0x000056145d3f1030 in ?? ()
#17 0x00007f8b095c1720 in ?? ()
#18 0x00007f8b130d9ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb5874e28038000 in ?? ()
#21 0x0000000000000000 in ?? ()
Thread 9 (LWP 9835):
#0 0x00007f8b17dc7a47 in ?? ()
#1 0x00007f8b0b3b1680 in ?? ()
#2 0x00007f8b130d3571 in ?? ()
#3 0x00000000000000cb in ?? ()
#4 0x000056145d317e18 in ?? ()
#5 0x00007f8b0b3b16c0 in ?? ()
#6 0x00007f8b0b3b1840 in ?? ()
#7 0x000056145d3f1030 in ?? ()
#8 0x00007f8b130d525d in ?? ()
#9 0x3fb984be5e551000 in ?? ()
#10 0x000056145d302680 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000056145d302680 in ?? ()
#13 0x000000005d317e18 in ?? ()
#14 0x0000561400000000 in ?? ()
#15 0x41da2dd847e1f7ca in ?? ()
#16 0x000056145d3f1030 in ?? ()
#17 0x00007f8b0b3b1720 in ?? ()
#18 0x00007f8b130d9ba3 in ?? ()
#19 0x0000000000000000 in ?? ()
Thread 8 (LWP 9832):
#0 0x00007f8b17dbacb9 in ?? ()
#1 0x00007f8b0cbb49c0 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 7 (LWP 9831):
#0 0x00007f8b19c76fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000000 in ?? ()
Thread 6 (LWP 9830):
#0 0x00007f8b19c7a9e2 in ?? ()
#1 0x000056145d233ee0 in ?? ()
#2 0x00007f8b0bbb24d0 in ?? ()
#3 0x00007f8b0bbb2450 in ?? ()
#4 0x00007f8b0bbb2570 in ?? ()
#5 0x00007f8b0bbb2790 in ?? ()
#6 0x00007f8b0bbb27a0 in ?? ()
#7 0x00007f8b0bbb24e0 in ?? ()
#8 0x00007f8b0bbb24d0 in ?? ()
#9 0x000056145d232350 in ?? ()
#10 0x00007f8b1a065c6f in ?? ()
#11 0x3ff0000000000000 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 5 (LWP 9824):
#0 0x00007f8b19c76fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x000000000000002f in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000056145d3b8dc8 in ?? ()
#5 0x00007f8b0dbb6430 in ?? ()
#6 0x000000000000005e in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 4 (LWP 9823):
#0 0x00007f8b19c76fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000002 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000056145d218848 in ?? ()
#5 0x00007f8b0e3b7790 in ?? ()
#6 0x0000000000000004 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 3 (LWP 9822):
#0 0x00007f8b19c76fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000002 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000056145d2182a8 in ?? ()
#5 0x00007f8b0ebb8790 in ?? ()
#6 0x0000000000000004 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 2 (LWP 9821):
#0 0x00007f8b19c76fb9 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000000000002 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x000056145d218188 in ?? ()
#5 0x00007f8b0f3b9790 in ?? ()
#6 0x0000000000000004 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 1 (LWP 9820):
#0 0x00007f8b19c7ad50 in ?? ()
#1 0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20250902 21:33:04.937927 4307 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6wlXYv/build/release/bin/kudu with pid 9689
I20250902 21:33:04.942813 4307 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6wlXYv/build/release/bin/kudu with pid 9554
I20250902 21:33:04.955288 4307 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6wlXYv/build/release/bin/kudu with pid 9953
I20250902 21:33:04.967988 4307 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6wlXYv/build/release/bin/kudu with pid 9820
I20250902 21:33:04.980199 4307 external_mini_cluster.cc:1658] Killing /tmp/dist-test-task6wlXYv/build/release/bin/kudu with pid 5468
2025-09-02T21:33:04Z chronyd exiting
I20250902 21:33:04.996601 4307 test_util.cc:183] -----------------------------------------------
I20250902 21:33:04.996670 4307 test_util.cc:184] Had failures, leaving test files at /tmp/dist-test-task6wlXYv/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1756848720918944-4307-0
[ FAILED ] RollingRestartArgs/RollingRestartITest.TestWorkloads/4, where GetParam() = 800-byte object <01-00 00-00 01-00 00-00 04-00 00-00 00-00 00-00 20-A0 CC-5D D9-55 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 40-A0 CC-5D D9-55 00-00 00-00 00-00 00-00 00-00 ... 00-00 00-00 00-00 00-00 F8-A2 CC-5D D9-55 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-01 00-00 00-00 00-00 04-00 00-00 03-00 00-00 01-00 00-00 00-00 00-00> (48686 ms)
[----------] 1 test from RollingRestartArgs/RollingRestartITest (48686 ms total)
[----------] Global test environment tear-down
[==========] 2 tests from 2 test suites ran. (64073 ms total)
[ PASSED ] 1 test.
[ FAILED ] 1 test, listed below:
[ FAILED ] RollingRestartArgs/RollingRestartITest.TestWorkloads/4, where GetParam() = 800-byte object <01-00 00-00 01-00 00-00 04-00 00-00 00-00 00-00 20-A0 CC-5D D9-55 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 40-A0 CC-5D D9-55 00-00 00-00 00-00 00-00 00-00 ... 00-00 00-00 00-00 00-00 F8-A2 CC-5D D9-55 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-01 00-00 00-00 00-00 04-00 00-00 03-00 00-00 01-00 00-00 00-00 00-00>
1 FAILED TEST
I20250902 21:33:04.997179 4307 logging.cc:424] LogThrottler /home/jenkins-slave/workspace/build_and_test_flaky/src/kudu/client/meta_cache.cc:302: suppressed but not reported on 17 messages since previous log ~9 seconds ago
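
Note on the "800-byte object <01-00 00-00 ...>" text in the FAILED lines above: this is Google Test's fallback printer. When a value-parameterized test fails and the parameter type has no printer registered, GetParam() is dumped as a raw byte blob. The sketch below is only an illustration of how such a parameter can be made readable by defining PrintTo(); WorkloadParams and its fields are hypothetical stand-ins, not the actual parameter type used by RollingRestartITest.

    // Minimal sketch (link with gtest_main). Assumes a hypothetical parameter
    // struct; the real test's parameter type and fields may differ.
    #include <gtest/gtest.h>
    #include <ostream>

    namespace example {

    struct WorkloadParams {            // hypothetical stand-in for the test parameter
      int num_masters;
      int num_tablet_servers;
    };

    // Google Test finds this via argument-dependent lookup when printing GetParam(),
    // replacing the "N-byte object <...>" fallback with readable output.
    void PrintTo(const WorkloadParams& p, std::ostream* os) {
      *os << "WorkloadParams{masters=" << p.num_masters
          << ", tservers=" << p.num_tablet_servers << "}";
    }

    class WorkloadTest : public ::testing::TestWithParam<WorkloadParams> {};

    TEST_P(WorkloadTest, Runs) {
      EXPECT_GE(GetParam().num_tablet_servers, GetParam().num_masters);
    }

    INSTANTIATE_TEST_SUITE_P(Sizes, WorkloadTest,
                             ::testing::Values(WorkloadParams{1, 3},
                                               WorkloadParams{3, 3}));

    }  // namespace example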